Digital radiography systems are in common use for medical imaging, yet few studies have examined computed radiography (CR) quality performance in terms of reject rates. The gap stems primarily from the difficulty of obtaining the data required to calculate reject statistics.
"This problem has been further compounded by the lack of software infrastructure necessary to centrally compile data for radiology departments that have multiple digital capture devices," said David H. Foos of the Clinical Applications Research Laboratory at Carestream Health.
Foos recently described a methodology used to compile a comprehensive database consisting of more than 288,000 CR images from two all-digital radiology departments in a university hospital and a large community hospital in order to perform reject analysis (J Digit Imaging 2008;Apr 30 [Epub ahead of print]). Each database record contains image information such as body part and view position, exposure level, technologist identifier, and reason for rejection.
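The paper's database schema is not published; purely as an illustrative sketch, one record of the kind Foos describes could be modeled in Python as below. The field names are assumptions drawn from the attributes listed above, not the study's actual structure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RejectRecord:
    """One CR acquisition event; field names are illustrative, not from the study."""
    body_part: str                # e.g., "chest"
    view_position: str            # e.g., "PA"
    exposure_index: float         # device-reported exposure level
    technologist_id: str          # identifier of the acquiring technologist
    rejected: bool                # True if the image was rejected
    reject_reason: Optional[str]  # e.g., "positioning error"; None if accepted
```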
"Our results suggest that complete and accurate reject analysis in digital radiography is possible, but it requires that quality assurance software tools and associated network infrastructure be combined with rigorous data entry protocols to ensure the integrity of the analysis," Foos said.
Accurate reject analysis provides a platform from which to develop targeted training programs designed to mitigate the largest sources of patient repeat exposures, he said.
The study found that the overall CR reject rate across all exam types was 4.4% at the university hospital and 4.9% at the community hospital.
"Positioning errors and anatomy cutoff were the most frequently occurring reasons for rejection, accounting for 45% of rejects at the community hospital and 56% at the university," Foos said.
The next most frequently cited reasons for rejection were improper exposure (14% at the community hospital and 13% at the university), followed by patient motion (11% at the community hospital and 7% at the university).
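The arithmetic behind such figures reduces to simple counts: the reject rate is rejected images divided by total acquisitions, and each reason's percentage is its share of the rejects. A minimal sketch, assuming a list of the hypothetical RejectRecord objects outlined earlier rather than the study's actual tooling:

```python
from collections import Counter

def reject_stats(records):
    """Return the overall reject rate and each reason's share of rejects (illustrative only)."""
    total = len(records)
    rejects = [r for r in records if r.rejected]
    rate = len(rejects) / total if total else 0.0
    reason_share = {}
    if rejects:
        counts = Counter(r.reject_reason for r in rejects)
        reason_share = {reason: n / len(rejects) for reason, n in counts.items()}
    return rate, reason_share
```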
"Comprehensive digital radiography QA requires that a mechanism be put into place to force technologists to enter reject data information into a database, preferably before another image can be scanned," Foos said.
He also suggested that a standardized lexicon for reject reasons be organized to eliminate inconsistencies in reject data entry.
"Standardized terminology and definitions for QA deficiencies must be established, along with the associated training, to eliminate inconsistent and inappropriate labeling of rejected images," he said.
Tools such as digital dashboards and device clustering software platforms are now available to assist with QA efforts, Foos said.
"These tools facilitate access to the objective data necessary to analyze and report on reject statistics and digital radiography equipment utilization across an entire institution," he said.