- intraoperative radiotherapy
- output variations
- x‐ray source commissioning
INTRABEAM x‐ray sources (XRSs) have distinct output characteristics due to subtle variations between the ideal design and the manufactured product. The objective of this study is to intercompare 15 XRSs and to quantify dosimetrically the impact of manufacturing variations on the delivered dose.
The normality of the XRS datasets was evaluated with the Shapiro–Wilk test, the accuracy of the calibrated depth–dose curves (DDCs) was validated with ionization chamber measurements, and the shape of each DDC was evaluated using depth–dose ratios (DDRs). For 20 Gy prescribed to the spherical applicator surface, the dose was computed at 5‐mm and 10‐mm depths from the spherical applicator surface for all XRSs.
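The statistical checks described above can be sketched as follows. This is a minimal illustration with hypothetical values, not the study's measured data; the DDR is assumed here to be the ratio of doses at two depths on the same calibrated DDC.

```python
# Sketch of the normality and DDR evaluation described in the methods.
# All numbers below are hypothetical placeholders, not measured values.
from scipy import stats

# Hypothetical outputs (Gy/min) at a fixed depth for several XRSs
outputs = [0.34, 0.35, 0.36, 0.36, 0.37, 0.35, 0.36, 0.37]

# Shapiro-Wilk test for normality of the XRS dataset
w_stat, p_value = stats.shapiro(outputs)
is_normal = p_value > 0.05  # fail to reject normality at alpha = 0.05

# Depth-dose ratio (DDR): assumed here as the dose at one depth divided
# by the dose at a deeper reference point on the same calibrated DDC
dose_10mm, dose_20mm = 0.92, 0.36  # hypothetical doses (Gy/min)
ddr = dose_10mm / dose_20mm
```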
At 5‐, 10‐, 20‐, and 30‐mm depths from the source, the coefficient of variation (CV) of the XRS output was 4.4%, 2.8%, 2.0%, and 3.1% at 40 kVp and 4.2%, 3.8%, 3.8%, and 3.4% at 50 kVp, respectively. At a 20‐mm depth from the source, the 40‐kVp energy had a mean output of 0.36 Gy/min, standard deviation (SD) of 0.0072, minimum output of 0.34, and maximum output of 0.37, and the 50‐kVp energy had a mean output of 0.56 Gy/min, SD of 0.021, minimum output of 0.52, and maximum output of 0.60. We noted maximum DDR values of 2.8% and 2.5% for 40 kVp and 50 kVp, respectively. For all XRSs, the maximum dosimetric effect of these variations within a 10‐mm depth of the applicator surface was ≤ 2.5%. The CV increased as depth increased and as applicator size decreased.
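The CV figures quoted above follow from the standard definition CV = SD/mean. A minimal sketch, using hypothetical per‐XRS outputs chosen to be consistent with the reported 40‐kVp, 20‐mm summary statistics (the 15 individual measured values are not given in the abstract):

```python
# Coefficient-of-variation (CV) sketch; data are hypothetical placeholders
# roughly matching the reported mean ~0.36 Gy/min and SD ~0.0072.
import statistics

outputs_40kvp_20mm = [0.34, 0.35, 0.355, 0.36, 0.36, 0.36, 0.365, 0.37,
                      0.36, 0.355, 0.36, 0.365, 0.36, 0.355, 0.37]

mean = statistics.mean(outputs_40kvp_20mm)   # mean output (Gy/min)
sd = statistics.stdev(outputs_40kvp_20mm)    # sample standard deviation
cv_percent = 100.0 * sd / mean               # CV = SD / mean, in percent
```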
The American Association of Physicists in Medicine Task Group‐167 requires that impurities in radionuclides used for brachytherapy produce ≤ 5.0% dosimetric variations. Given the differences in XRS output and DDC shape, we have demonstrated that the dosimetric variations within a 10‐mm depth of the applicator surface are ≤ 2.5%.