Purpose: One promising option for motion management in radiation therapy (RT) is a LINAC-compatible, robotic-arm-mounted ultrasound imaging system, owing to its high soft-tissue contrast, real-time capability, absence of ionizing radiation, and low cost. The purpose of this work is to develop a novel deep learning-based real-time motion tracking strategy for ultrasound image-guided RT.
Methods: The proposed tracker combined an attention-aware fully convolutional neural network (FCNN) with a convolutional long short-term memory (CLSTM) network in an end-to-end trainable architecture. A glimpse sensor module was built into the attention-aware FCNN to discard the majority of the background by focusing on a region containing the object of interest. The FCNN extracted discriminative spatial features from the glimpse to facilitate temporal modeling by the CLSTM. A saliency mask computed from the CLSTM output refined the features specific to the tracked landmarks. Moreover, a multitask loss strategy comprising a bounding-box loss, a localization loss, a saliency loss, and an adaptive loss-weighting term was used to facilitate training convergence and to avoid over- or underfitting. The tracker was tested on the databases provided by the MICCAI 2015 challenges, and the ground-truth data were obtained using brute-force template matching.
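To make the Methods concrete, the sketch below shows one way a glimpse-based FCNN encoder, a ConvLSTM cell, a saliency-gated prediction head, and an adaptively weighted multitask loss could be wired together. It is a minimal illustration, assuming a PyTorch implementation; the layer widths, the GlimpseTracker and AdaptiveMultitaskLoss names, and the learnable log-variance weighting scheme are assumptions for illustration, not the authors' exact architecture or loss formulation.

```python
# Minimal sketch of an FCNN + ConvLSTM tracker with a saliency-gated head and
# an adaptive multitask loss. All names, layer sizes, and the log-variance
# weighting are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvLSTMCell(nn.Module):
    """Single convolutional LSTM cell with jointly computed gates."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, (h, c)


class GlimpseTracker(nn.Module):
    """FCNN spatial encoder -> ConvLSTM temporal model -> saliency-gated heads."""
    def __init__(self, hid_ch=32):
        super().__init__()
        self.fcnn = nn.Sequential(               # spatial features of the glimpse
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, hid_ch, 3, padding=1), nn.ReLU(),
        )
        self.clstm = ConvLSTMCell(hid_ch, hid_ch)
        self.saliency = nn.Conv2d(hid_ch, 1, 1)   # per-pixel saliency mask
        self.bbox_head = nn.Linear(hid_ch, 4)     # bounding box (cx, cy, w, h)
        self.loc_head = nn.Linear(hid_ch, 2)      # landmark location (x, y)

    def forward(self, glimpses):
        # glimpses: (B, T, 1, H, W) image crops around the tracked landmark
        B, T, _, H, W = glimpses.shape
        h = glimpses.new_zeros(B, self.clstm.hid_ch, H, W)
        state, outputs = (h, h.clone()), []
        for t in range(T):
            feat = self.fcnn(glimpses[:, t])
            h, state = self.clstm(feat, state)
            sal = torch.sigmoid(self.saliency(h))  # refine features via saliency
            pooled = (h * sal).mean(dim=(2, 3))
            outputs.append((self.bbox_head(pooled), self.loc_head(pooled), sal))
        return outputs


class AdaptiveMultitaskLoss(nn.Module):
    """Combines bbox/localization/saliency losses with learnable weights."""
    def __init__(self):
        super().__init__()
        self.log_vars = nn.Parameter(torch.zeros(3))

    def forward(self, pred, target):
        bbox_p, loc_p, sal_p = pred
        bbox_t, loc_t, sal_t = target
        losses = torch.stack([
            F.smooth_l1_loss(bbox_p, bbox_t),
            F.mse_loss(loc_p, loc_t),
            F.binary_cross_entropy(sal_p, sal_t),
        ])
        # adaptive weighting (one possible form): sum_i exp(-s_i) * L_i + s_i
        return (torch.exp(-self.log_vars) * losses + self.log_vars).sum()
```

In this sketch the saliency mask gates the ConvLSTM features before pooling, so the regression heads attend only to regions relevant to the tracked landmark, and the learnable weights let the three loss terms rebalance themselves during training rather than relying on hand-tuned coefficients.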
Results: A mean tracking error of 0.97 ± 0.52 mm and a maximum tracking error of 1.94 mm were observed for 85 point landmarks across 39 ultrasound cases, relative to the ground-truth annotations. The tracking speed per frame per landmark with the GPU implementation ranged from 66 to 101 frames per second, far exceeding the ultrasound imaging rate.
Conclusion: The results demonstrated the robustness and accuracy of the proposed deep learning-based motion estimation, despite known shortcomings of ultrasound imaging such as speckle noise. The tracking speed was sufficiently fast for real-time applications in the RT environment. The approach provides a valuable tool for guiding RT with beam gating or multileaf collimator (MLC) tracking in real time.
Keywords: convolutional neural network; deep learning; motion management; recurrent neural network; ultrasound tracking.
© 2019 American Association of Physicists in Medicine.