Author: Quan Chen, Xue Feng, Chunhui Han, Gaofeng Huang, Trevor Ketcherside, Yi Lao, Yun Rose Li, An Liu, Bo Liu, Kun Qing, William T. Watkins
Affiliation: Graduate Program in Bioengineering, University of California San Francisco-UC Berkeley; Department of Radiation Oncology, City of Hope National Medical Center; Mayo Clinic Arizona; Carina Medical LLC
Purpose: New treatment platforms such as Ethos (Varian Medical Systems) allow multi-modal imaging to be introduced into the adaptive radiotherapy workflow, providing an up-to-date view of the patient's anatomy. This is particularly important for disease sites in the pelvis and abdomen, where the treatment target is surrounded by rapidly changing organs. However, existing commercial software does not provide consistently accurate multimodality image registration, and without automation it is very time-consuming to use information from a different imaging modality. The purpose of this work is to develop a new multimodality deformable image registration method and to test its utility in a real treatment scenario.
Methods: Two prostate bed patients were selected for this study. Each patient received an initial course of 45 Gy followed by a boost of up to 14 fractions, all delivered at 1.8 Gy per fraction. The initial MRI images were registered to CBCT images acquired every two weeks during the treatment course (5 CBCT image sets per patient). Our registration method aligns images based on contours auto-segmented from the MRI and CT images with a U-Net [1]. As an initial test, only the bladder contours were used. A rigid registration was performed first; the deformable registration then started from the rigid result and applied explicit B-spline regularization [2] driven by the selected contours (bladder) to obtain well-behaved deformation fields across the entire image. Manual registration by experienced physicians was used as ground truth, and a conventional mutual-information-based registration provided by ANTs [3] was used for comparison.
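The rigid step that precedes the deformable stage can be illustrated with a toy centroid alignment of two binary organ masks. This is only a minimal sketch of a contour-driven rigid initialization, not the actual Ethos/U-Net pipeline; all names and the 2-D masks below are illustrative.

```python
import numpy as np

def centroid(mask):
    """Center of mass of a binary mask, in voxel coordinates."""
    idx = np.argwhere(mask)
    return idx.mean(axis=0)

def rigid_translation(fixed_mask, moving_mask):
    """Translation that maps the moving mask's centroid onto the
    fixed mask's centroid -- a minimal stand-in for the rigid
    initialization performed before deformable registration."""
    return centroid(fixed_mask) - centroid(moving_mask)

# Toy 2-D "bladder" masks offset from each other by (2, 3) voxels
fixed = np.zeros((32, 32), dtype=bool)
moving = np.zeros((32, 32), dtype=bool)
fixed[10:20, 12:22] = True
moving[8:18, 9:19] = True

shift = rigid_translation(fixed, moving)
print(shift)  # → [2. 3.]
```

In practice the rigid transform would include rotation and be estimated on the full 3-D auto-segmented contours, and the resulting alignment seeds the B-spline deformable stage.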
Results: Our U-Net-based rigid and deformable registrations yielded significantly smaller registration errors than the ANTs-based method (P<0.001). The U-Net deformable registration also produced prostate bed contours closer to the ground-truth contours, although that difference was not statistically significant.
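Contour agreement of the kind reported above is commonly quantified with the Dice similarity coefficient. The sketch below shows a standard Dice computation on binary masks; the abstract does not state which overlap metric was used, so this is an assumed, illustrative choice.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    overlap = np.logical_and(a, b).sum()
    return 2.0 * overlap / (a.sum() + b.sum())

# Toy example: two 1-D "contours" of equal size overlapping by half
a = np.zeros(8, dtype=bool); a[0:4] = True
b = np.zeros(8, dtype=bool); b[2:6] = True
print(dice(a, b))  # → 0.5
```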
Conclusion: Automated multimodality image registration driven by U-Net-based segmentation is highly accurate and has great potential for clinical application.