Deep Learning-Based Eye Monitoring and Tracking System for Ocular Proton Therapy in a Regular Gantry Room

Authors: David H. Abramson, Christopher Barker, Jasmine H. Francis, Meng Wei Ho, Yen-Po Lee, Haibo Lin, Hang Qi, Andy Shim, Charles B. Simone, Weihong Sun, Xiaoxuan Xu, Siyu Yang, Francis Yu, Anna Zhai

Affiliations: College of Machine Intelligence, Nankai University; New York Proton Center; Department of Biomedical Engineering, Johns Hopkins University; Department of Radiation Oncology, Memorial Sloan Kettering Cancer Center

Abstract:

Purpose: Proton therapy is an effective modality for treating ocular tumors such as uveal melanoma. We developed a novel camera-based eye-tracking system to provide real-time, noninvasive eye positioning and motion monitoring. The system tracks pupil coordinates, while a newly developed 3D model dynamically simulates eye movement derived from pupil motion, quantifying the corresponding variations in target coverage and the potential dosimetric impact.

Methods: An infrared camera and light source were employed for pupil illumination. A deep learning workflow (MediaPipe FaceMesh for eye localization, TensorFlow for pupil detection) computes the eye's coordinates in real time. Based on the planning CT and target contour, a Python routine reconstructs a 3D eyeball model and the clinical target volume (CTV), mapping the 2D pupil movements onto a patient-specific ellipsoidal geometry. Beyond the 2D pupil displacements, the system computes the target localization and compares it to the prescription dose cloud. Daily cone-beam CT (CBCT) and kV imaging confirm patient alignment and provide a baseline for the tracking system. A tolerance-based alert issues warnings when deviations exceed predefined thresholds.
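The core geometric step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it simplifies the patient-specific ellipsoid to a sphere, and the function names, the 12 mm eye radius, and the small-angle arc approximation are all assumptions made for the example. The tolerance check mirrors the tolerance-based alert, with the 3 mm planning margin as the threshold.

```python
import numpy as np

def pupil_to_rotation(dx_mm, dy_mm, eye_radius_mm=12.0):
    """Convert a 2D pupil displacement (mm) into small-angle yaw/pitch
    rotations (rad) about the eye center, treating the pupil as moving
    on a sphere of the given radius (arc-length approximation)."""
    return dx_mm / eye_radius_mm, dy_mm / eye_radius_mm

def rotated_target(target_xyz, yaw, pitch):
    """Rotate a target point (eye-centered coordinates, mm) by the
    estimated yaw (about the y-axis) and pitch (about the x-axis)."""
    Ry = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(yaw), 0.0, np.cos(yaw)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(pitch), -np.sin(pitch)],
                   [0.0, np.sin(pitch), np.cos(pitch)]])
    return Ry @ Rx @ np.asarray(target_xyz, dtype=float)

def within_tolerance(reference_xyz, current_xyz, tol_mm=3.0):
    """Tolerance-based alert: True while the target remains inside the
    planning margin, False when a warning should be issued."""
    return float(np.linalg.norm(np.asarray(current_xyz) -
                                np.asarray(reference_xyz))) <= tol_mm
```

For example, a 1.4 mm lateral pupil shift on a 12 mm-radius eye corresponds to a rotation of roughly 0.12 rad, which moves an anterior-pole target by about 1.4 mm, well inside a 3 mm margin.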

Results: The enhanced workflow achieves a 3D reconstruction accuracy of ≤0.2 mm, validated via daily CBCT and kV imaging. Physical information in the CBCT serves as a calibration reference to align the eye and CTV in a shared 3D coordinate system, ensuring that sub-millimetric changes in ocular position are consistently detected. For the first test patient, the system reliably measured displacements up to 1.4 mm (versus the 3 mm planning margin) with a 0.5-second response time. Although minor eye motions were observed, the combined 2D pupil tracking and 3D target-position monitoring confirmed robust treatment during each beam delivery.

Conclusion: This in-house camera-based system tracks ocular motion in real time and offers 3D prediction of target coverage for proton therapy, maintaining high-accuracy alignment throughout treatment. Leveraging deep learning and an ellipsoidal 3D model, it enables noninvasive, sub-millimetric monitoring of tumor position.
