Determining the Optimal CT Tube Voltage for Imaging Bone Structures in Radiotherapy

Authors: Mohamed Bahaaeldin Mohamed Afifi, Nahla Nagy Ahmad Ataalla, Ahmed A. Eldib

Affiliations: Fox Chase Cancer Center; Radiological Sciences and Medical Imaging Department, College of Applied Medical Sciences, Prince Sattam bin Abdulaziz University.

Abstract:

Purpose: Optimizing CT imaging parameters is crucial for balancing radiation dose, contrast resolution, and accurate Hounsfield unit representation, particularly in radiotherapy treatment planning. This study investigates the optimal tube voltage for bone imaging in radiotherapy.
Methods: We used the CIRS Model-062M plastic phantom, which comprises nine inserts of varying electron density. CT scans were acquired at tube voltages of 80 kV and 140 kV with a fixed tube current of 300 mA. The resulting images were analyzed in the Eclipse treatment planning system (Version 16.1, Varian). Metrics included the Dose Attenuation Coefficient Percent (DACP), to evaluate beam attenuation variability, and the Equivalent Path Length (EPL), to assess bone density accuracy. Calculated EPLs were compared with TPS-derived values for validation.
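The EPL metric mentioned above can be illustrated with a minimal sketch: each voxel's physical thickness along a ray is scaled by its relative electron density (RED), which is obtained from a CT-number calibration curve. The piecewise-linear calibration values below are purely illustrative placeholders, not the CIRS Model-062M manufacturer data or the calibration used in this study.

```python
def hu_to_red(hu):
    """Map a CT number (HU) to relative electron density via a
    simple two-segment piecewise-linear calibration.
    Calibration points are illustrative, not clinical values."""
    if hu <= 0:
        # air (-1000 HU, RED 0.0) up to water (0 HU, RED 1.0)
        return max(0.0, 1.0 + hu / 1000.0)
    # water (0 HU) up to a dense-bone point (1500 HU -> RED 1.9)
    return 1.0 + hu * (0.9 / 1500.0)

def equivalent_path_length(hu_values, voxel_mm):
    """Water-equivalent path length (mm) along one ray:
    sum of voxel thickness weighted by relative electron density."""
    return sum(hu_to_red(hu) * voxel_mm for hu in hu_values)

# Example ray: 10 water-like voxels followed by 10 bone-like voxels,
# each 1.5 mm thick (30 mm physical path).
ray = [0] * 10 + [1500] * 10
print(equivalent_path_length(ray, 1.5))  # -> 43.5 (15 mm water + 28.5 mm bone-equivalent)
```

Under this sketch, a voltage-dependent shift in the measured CT numbers propagates directly into the RED lookup and hence into the EPL, which is why HU accuracy matters for treatment planning.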
Results: EPL values displayed tube-voltage dependence, with 140 kV yielding superior alignment with TPS-derived values and manufacturer-provided electron density data. In contrast, 80 kV imaging introduced greater uncertainty in CT number accuracy.
Conclusion: The results support the use of 140 kV for CT bone imaging in radiotherapy, given its enhanced accuracy in Hounsfield unit representation and its alignment with clinical treatment planning requirements. Future research should explore additional factors influencing CT imaging accuracy, such as reconstruction algorithms.
