Abstract
Purpose
To propose a framework for synergistic reconstruction of PET‐MR and multi‐contrast MR data to improve the image quality obtained from noisy PET data and from undersampled MR data.
Theory and Methods
Weighted quadratic priors were devised to preserve common boundaries between PET‐MR images while reducing noise, PET Gibbs ringing, and MR undersampling artifacts. These priors were iteratively reweighted using normalized multi‐modal Gaussian similarity kernels. Synergistic PET‐MR reconstructions were built on the PET maximum a posteriori expectation maximization algorithm and the MR regularized sensitivity encoding method. The proposed approach was compared to conventional, total variation, and prior‐image weighted quadratic regularization methods. Comparisons were performed on a simulated [18F]fluorodeoxyglucose‐PET and T1/T2‐weighted MR brain phantom, 2 in vivo T1/T2‐weighted MR brain datasets, and an in vivo [18F]fluorodeoxyglucose‐PET and fluid‐attenuated inversion recovery/T1‐weighted MR brain dataset.
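To make the idea of an iteratively reweighted quadratic prior concrete, the following is a minimal sketch of the general form such a penalty can take; the specific weighting, normalization, and coupling used in the paper are not reproduced here, and the symbols (image $u$, neighbourhood $\mathcal{N}_j$, current PET estimate $p$, MR estimate $m$, kernel widths $\sigma_p$, $\sigma_m$) are illustrative assumptions.

```latex
% Hedged sketch, not the paper's exact formulation:
% a weighted quadratic penalty on an image u, whose edge-preserving weights
% w_{jk} are rebuilt at each outer iteration from normalized multi-modal
% Gaussian similarity kernels over the current PET (p) and MR (m) estimates.
R(u) = \frac{1}{2}\sum_{j}\sum_{k \in \mathcal{N}_j} w_{jk}\,(u_j - u_k)^2,
\qquad
w_{jk} \;\propto\;
\exp\!\left(-\frac{(p_j - p_k)^2}{2\sigma_p^2}\right)
\exp\!\left(-\frac{(m_j - m_k)^2}{2\sigma_m^2}\right),
\qquad
\sum_{k \in \mathcal{N}_j} w_{jk} = 1 .
```

Under this kind of construction, voxel pairs that look similar in both modalities are smoothed strongly, while pairs separated by a boundary visible in either modality receive small weights, which is how common edges can be preserved while noise and artifacts are suppressed.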
Results
Simulations showed that the synergistic reconstructions achieved the lowest quantification errors for all image modalities compared to the conventional, total variation, and weighted quadratic methods. Although total variation regularization preserved modality‐unique features, it failed to recover PET details and did not reduce MR artifacts as effectively as our proposed method. For the in vivo MR data, our method maintained similar image quality at 3× and 14× acceleration. Reconstruction of the PET‐MR dataset likewise demonstrated improved performance of our method over the conventional independent methods in terms of reduced Gibbs and undersampling artifacts.
Conclusion
The proposed methodology offers a robust multi‐modal synergistic image reconstruction framework that can be readily built on existing established algorithms.
| Original language | English |
|---|---|
| Pages (from-to) | 2120-2134 |
| Number of pages | 15 |
| Journal | Magnetic Resonance in Medicine |
| Volume | 81 |
| Issue number | 3 |
| Early online date | 16 Oct 2018 |
| DOIs | |
| Publication status | Published - Mar 2019 |
Keywords
- Multi-modal imaging
- PET-MRI
- synergistic reconstruction