3D unsupervised deep learning method for magnetic resonance imaging-to-computed tomography synthesis in prostate radiotherapy

Phys Imaging Radiat Oncol. 2024 Jul 19;31:100612. doi: 10.1016/j.phro.2024.100612. eCollection 2024 Jul.

Abstract

Background and purpose: Magnetic resonance imaging (MRI)-to-computed tomography (CT) synthesis is essential in MRI-only radiotherapy workflows, particularly through deep learning techniques known for their accuracy. However, current supervised methods are limited to the specific centers whose data they were trained on and depend on registration precision. The aim of this study was to evaluate the accuracy of unsupervised and supervised approaches to prostate MRI-to-CT generation for radiotherapy dose calculation.

Methods: CT/MRI image pairs from 99 prostate cancer patients across three centers were used. Supervised and unsupervised conditional generative adversarial networks (cGANs) were compared. Unsupervised training incorporated a style transfer method with a Content and Style Representation for Enhanced Perceptual synthesis (CREPs) loss. For dose evaluation, the photon prescription dose was 60 Gy delivered with volumetric modulated arc therapy (VMAT). The imaging endpoint for synthetic CT (sCT) evaluation was the mean absolute error (MAE); dosimetric endpoints were absolute dose differences and gamma analysis between CT and sCT dose calculations.
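The abstract does not specify the CREPs loss itself. As a rough illustration of the general idea behind content/style perceptual losses, the sketch below combines a content term (feature-map agreement, preserving anatomy) with a style term (Gram-matrix agreement, matching target-modality texture), in the spirit of classical neural style transfer. The feature shapes, `style_weight`, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (C, H, W) feature map: channel-wise correlations
    that summarize texture/"style" while discarding spatial layout."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_style_loss(feat_gen, feat_content, feat_style, style_weight=1e3):
    """Illustrative perceptual loss (not the published CREPs loss).

    The content term keeps the generated image's spatial structure close
    to the source MRI features; the style term pushes its channel
    statistics toward CT-like features, without needing paired images.
    """
    content_loss = np.mean((feat_gen - feat_content) ** 2)
    style_loss = np.mean((gram_matrix(feat_gen) - gram_matrix(feat_style)) ** 2)
    return content_loss + style_weight * style_loss
```

In practice such feature maps would come from a fixed pretrained encoder; the style term requires no MRI-CT registration, which is what makes a perceptual formulation attractive for unpaired training.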

Results: The unsupervised paired network exhibited the highest accuracy for the body, with an MAE of 33.6 HU; the highest MAE, 45.5 HU, was obtained with unsupervised unpaired learning. All architectures provided clinically acceptable results for dose calculation, with gamma pass rates above 94% (1%/1 mm criterion, 10% dose threshold).
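The MAE endpoint is simply the mean absolute HU difference between the sCT and the reference CT, typically restricted to a body contour. A minimal sketch (array names and the mask convention are assumptions):

```python
import numpy as np

def mae_hu(sct, ct, body_mask):
    """Mean absolute error in Hounsfield units inside the body mask.

    sct, ct   : arrays of HU values on the same voxel grid
    body_mask : boolean array selecting voxels inside the body contour
    """
    return float(np.mean(np.abs(sct[body_mask] - ct[body_mask])))

# Toy usage: a uniform 10 HU offset inside the mask gives MAE = 10.
ct = np.zeros((4, 4))
sct = np.full((4, 4), 10.0)
mask = np.ones((4, 4), dtype=bool)
print(mae_hu(sct, ct, mask))
```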

Conclusions: This study shows that multicenter data can produce accurate sCTs via unsupervised learning, eliminating the need for CT-MRI registration. The sCTs not only matched HU values but also enabled accurate dose calculations, suggesting their potential for wider use in MRI-only radiotherapy workflows.

Keywords: Perceptual loss; Synthetic CT; Unsupervised learning; cGAN.