Deep learning for dual-energy X-ray computed tomography

He Yang, Wenxiang Cong, Ge Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Dual-energy computed tomography (CT) uses two different X-ray source spectra to enhance material differentiation, a significant advantage over conventional single-source CT. In this paper, we propose a deep learning technique that can, in principle, produce a mono-energetic sinogram at any energy from dual-energy sinogram measurements. Specifically, we develop a convolutional neural network (CNN) that links dual-energy CT sinograms to mono-energetic sinograms. Trained on image patches from ground-truth datasets, our CNN successfully converts a pair of dual-energy CT measurements into a high-quality mono-energetic sinogram. We also find that suitable choices of loss function and learning rate are crucial to the image quality of the CNN prediction. Through extensive numerical simulations, we show that the l1 loss function outperforms the l2 loss function for our CNN, and we compare different learning rates to find an optimal value for our simulations. This is the first application of deep learning techniques to dual-energy CT.
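The abstract's core idea — learning a mapping from a pair of dual-energy sinogram patches to a mono-energetic patch with an l1 loss — can be illustrated with a minimal sketch. This is not the authors' network: it uses a single hand-rolled convolutional layer, toy data, and an arbitrary learning rate, purely to show the patch-based l1-loss training loop the paper describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid 2D convolution of a (C, H, W) input with one (C, k, k) kernel."""
    C, H, W = x.shape
    _, k, _ = w.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[:, i:i+k, j:j+k] * w)
    return out

# Toy data: low- and high-kVp sinogram patches stacked as two input channels,
# with a stand-in "mono-energetic" target (here just their average).
low, high = rng.random((8, 8)), rng.random((8, 8))
x = np.stack([low, high])            # (2, 8, 8) dual-energy input
target = 0.5 * (low + high)

w = rng.normal(scale=0.1, size=(2, 3, 3))   # one 3x3 kernel per channel
lr = 0.01                                    # illustrative learning rate

loss0 = np.mean(np.abs(conv2d(x, w) - target[1:-1, 1:-1]))

for _ in range(200):
    pred = conv2d(x, w)
    resid = pred - target[1:-1, 1:-1]        # crop target to the valid region
    # Subgradient of the l1 loss w.r.t. the kernel: sign of each residual
    # correlated with the input patch it came from.
    grad = np.zeros_like(w)
    for i in range(resid.shape[0]):
        for j in range(resid.shape[1]):
            grad += np.sign(resid[i, j]) * x[:, i:i+3, j:j+3]
    w -= lr * grad / resid.size

l1_loss = np.mean(np.abs(conv2d(x, w) - target[1:-1, 1:-1]))
print(l1_loss < loss0)
```

Swapping `np.sign(resid)` for `resid` in the gradient turns this into l2 training, which is the comparison the paper runs; the choice of `lr` plays the same role as the learning-rate sweep reported in the abstract.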
Original language: English (US)
Title of host publication: Proceedings of The 14th International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine
Pages: 864-869
Number of pages: 6
State: Published - 2017

Keywords

  • Dual-energy CT
  • Deep learning
  • Convolutional Neural Networks
  • Loss function
  • Learning rate

ASJC Scopus subject areas

  • Biomedical Engineering


Cite this

Yang, H., Cong, W., & Wang, G. (2017). Deep learning for dual-energy X-ray computed tomography. In Proceedings of The 14th International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine (pp. 864-869).