Researchers at the Cincinnati Children’s Hospital Medical Center have developed new imaging software that allowed radiologists to reduce patients’ radiation exposure by 37 percent, according to two studies recently published in Radiology.
“All new CT scanners now use automatic tube current modulation [ATCM], but we found that the image quality the ATCM produces does not match desired image quality over a range of patient sizes,” said David Larson, MD, radiation quality and safety director at the medical center and principal architect of the new technology. “Therefore, in order to try to optimize radiation dose, radiologists must create different protocols for different patient size ranges.”
The new software allows radiologists to mathematically determine the lowest possible radiation dose for the patient, while still producing diagnostic-quality images.
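The studies do not spell out the software’s internal formulas, but the basic idea of solving for the lowest dose that still meets an image quality target can be sketched from a standard CT relationship: quantum noise scales roughly as one over the square root of the tube current-time product (mAs). Under that assumption alone (the function name and parameters here are illustrative, not from the papers), the required mAs for a chosen noise target looks like:

```python
def required_mas(reference_mas, reference_noise_hu, target_noise_hu):
    """Estimate the tube current-time product needed to hit a target noise level.

    Assumes quantum noise scales as 1/sqrt(mAs), so the mAs must be scaled
    by the squared ratio of the reference noise to the target noise.
    """
    return reference_mas * (reference_noise_hu / target_noise_hu) ** 2


# Example: a reference scan at 100 mAs measured 10 HU of noise;
# to reach a more tolerant 12 HU target, less mAs (and dose) is needed.
print(required_mas(100.0, 10.0, 12.0))
```

The same relationship also explains the cost of over-aggressive targets: halving the acceptable noise quadruples the required mAs, which is why quantifying radiologists’ actual noise tolerance (as the first study does) matters for dose reduction.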
Developing, Validating Model
In the first study, Larson and colleagues worked to develop this software by creating several models: a model measuring water-equivalent diameter based on the topogram; a model for estimating image noise and size-specific dose estimates; and a model to quantify radiologist image quality preferences.
The water-equivalent diameter model was validated on each axial section in eight CT exams of the abdomen, chest and pelvis. Results showed that the mean percentage difference between topogram-based and axial-based water-equivalent diameter estimates was –3.5% ± 2.2.
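For readers unfamiliar with the metric, water-equivalent diameter is a standard patient-size measure (defined in AAPM Report 220): the diameter of a water cylinder with the same total X-ray attenuation as the patient cross-section. A minimal sketch of the axial-based version, assuming only that definition (the study’s topogram-based estimator is more involved and is not reproduced here):

```python
import numpy as np


def water_equivalent_diameter(ct_slice_hu, pixel_area_mm2):
    """Water-equivalent diameter (mm) of one axial CT section.

    ct_slice_hu: 2D array of CT numbers in Hounsfield units.
    pixel_area_mm2: area of one pixel in square millimeters.
    """
    # Water-equivalent area: each pixel contributes its attenuation
    # relative to water (water = 0 HU -> factor 1; air = -1000 HU -> factor 0,
    # so surrounding air pixels drop out automatically).
    a_w = np.sum(ct_slice_hu / 1000.0 + 1.0) * pixel_area_mm2
    # Diameter of a water circle with that same attenuating area.
    return 2.0 * np.sqrt(a_w / np.pi)
```

For a uniform water region of 10,000 one-square-millimeter pixels, this returns about 112.8 mm, the diameter of a circle of equal area, which is the expected behavior for a pure-water phantom.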
The model for image noise and size-specific dose was validated in 16 examinations of anthropomorphic phantoms. Results showed that using the model, noise was underpredicted by approximately 0.86 HU ± 0.68 (P<.01), equivalent to 6.9 percent of the measured noise. When measuring volume CT dose index using the model, the mean difference between the model estimate and reported value was 0.8 percent ± 1.8.
Finally, the image quality model was validated in 32 CT examinations of the abdomen and pelvis by 10 radiologists. The mean differences between predicted and actual effective tube current-time product, size-specific dose estimates, and estimated image noise were –0.9 percent ± 9.3, –1.8 percent ± 10.6, and –0.5 percent ± 4.4, respectively.
Quality Improvement Study
In the second study, Larson and colleagues asked radiologists to score CT images to determine an acceptable amount of image noise. The researchers included 817 CT examinations: 490 acquired before the protocol changes, and 327 acquired after the protocol changes.
“We were able to significantly decrease the variation in CT image quality and radiation dose without impacting the workflow,” Larson told Diagnostic Imaging. “The radiologists also would not have known that there was a change if we had not told them, except that they noticed that the image quality was more consistent than it had been in the past.”
The researchers observed a small increase in overall image noise: the difference between actual image noise and target noise increased from –1.4 HU to 0.3 HU (P<.01). At the same time, the standard deviation of that difference decreased from 3.9 HU to 1.6 HU (P<.01), indicating more consistent image quality.
The protocol also resulted in a significant decrease in the mean size-specific dose estimate from 11.9 mGy to 7.5 mGy, a difference of 36.2% (P<.01).
“This technology translates what currently is a very difficult and time-consuming task into one that happens almost completely automatically,” Larson said. In addition, because the model is based on size and not age, it is likely that it can be applied widely, in both adults and children.
Larson said the next step is to develop and validate the software for other scanners for use at other sites.