An Approach for Live Motion Correction for TRUS-MR Prostate Fusion Biopsy using Deep Learning

Annu Int Conf IEEE Eng Med Biol Soc. 2021 Nov;2021:2993-2996. doi: 10.1109/EMBC46164.2021.9630254.

Abstract

TRUS-MR fusion guided biopsy depends heavily on the quality of alignment between the pre-operative Magnetic Resonance (MR) image and the live trans-rectal ultrasound (TRUS) image during the biopsy. Many factors influence the alignment of the prostate during the biopsy, such as rigid motion due to patient movement and deformation of the prostate due to probe pressure. For MR-TRUS alignment during a live procedure, both the efficiency and the accuracy of the algorithm play an important role. In this paper, we have designed a comprehensive framework for fusion-based biopsy using an end-to-end deep learning network that performs both rigid and deformation correction. Handling rigid and deformation correction in a single network reduces the computation time required for live TRUS-MR alignment. We used 6500 images from 34 subjects in this study. Our proposed registration pipeline achieves a Target Registration Error (TRE) of 2.51 mm after rigid and deformation correction on an unseen patient dataset. In addition, with a total computation time of 70 ms, we achieve a rendering rate of 14 frames per second (FPS), which makes our network well suited for live procedures.

Clinical Relevance- The literature shows that systematic biopsy, the standard sampling method for prostate biopsy, has a high false negative rate. TRUS-MR fusion guided biopsy reduces the false negative rate of prostate biopsy sampling. Therefore, a live TRUS-MR fusion framework is helpful for clinical prostate biopsy procedures.
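
Editor's note: the abstract describes a single end-to-end network that predicts both a rigid transform and a dense deformation field in one pass. The paper's architecture is not reproduced here; the PyTorch sketch below only illustrates that general idea under assumptions of our own: a shared encoder over a concatenated 2D, single-channel TRUS/MR pair, a rigid head regressing (angle, tx, ty), a deformation head producing a displacement field, and spatial-transformer warping. The class name JointRigidDeformableNet, all layer sizes, and the 2D input are illustrative and not taken from the paper.

    # Minimal sketch (not the authors' code): one network predicting a rigid
    # transform and a dense deformation field, then warping the moving (MR) image.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class JointRigidDeformableNet(nn.Module):
        def __init__(self):
            super().__init__()
            # Shared encoder over the concatenated fixed (TRUS) and moving (MR) images.
            self.encoder = nn.Sequential(
                nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            )
            # Head 1: rigid parameters (rotation angle, tx, ty).
            self.rigid_head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 3)
            )
            # Head 2: dense 2-channel displacement field at encoder resolution.
            self.flow_head = nn.Conv2d(64, 2, 3, padding=1)

        def forward(self, fixed, moving):
            feat = self.encoder(torch.cat([fixed, moving], dim=1))

            # Rigid stage: build a 2x3 affine matrix from (angle, tx, ty).
            angle, tx, ty = self.rigid_head(feat).unbind(dim=1)
            cos, sin = torch.cos(angle), torch.sin(angle)
            theta = torch.stack(
                [torch.stack([cos, -sin, tx], dim=1),
                 torch.stack([sin, cos, ty], dim=1)], dim=1)          # (B, 2, 3)
            rigid_grid = F.affine_grid(theta, moving.shape, align_corners=False)
            moved_rigid = F.grid_sample(moving, rigid_grid, align_corners=False)

            # Deformable stage: upsample the displacement field, add it to an
            # identity grid, and resample the rigidly aligned image.
            flow = F.interpolate(self.flow_head(feat), size=moving.shape[2:],
                                 mode='bilinear', align_corners=False)
            identity = F.affine_grid(
                torch.eye(2, 3, device=moving.device)
                     .unsqueeze(0).repeat(moving.size(0), 1, 1),
                moving.shape, align_corners=False)
            deform_grid = identity + flow.permute(0, 2, 3, 1)          # (B, H, W, 2)
            moved = F.grid_sample(moved_rigid, deform_grid, align_corners=False)
            return moved, theta, flow

    # Example: align a hypothetical 256x256 pre-operative MR slice to a live TRUS frame.
    fixed = torch.rand(1, 1, 256, 256)   # live TRUS frame
    moving = torch.rand(1, 1, 256, 256)  # pre-operative MR slice
    moved, theta, flow = JointRigidDeformableNet()(fixed, moving)

Because both corrections share one forward pass, the per-frame cost is a single network evaluation plus two resampling steps, which is consistent with the real-time budget reported in the abstract (about 70 ms per frame, roughly 14 FPS).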

MeSH terms

  • Deep Learning*
  • Humans
  • Image-Guided Biopsy
  • Male
  • Prostate / diagnostic imaging
  • Prostatic Neoplasms* / diagnostic imaging
  • Ultrasonography