
Monitor aging degrades CRT luminance


Long-term use of individual CRT monitors induces gradual deterioration in luminance, resulting in a detrimental effect on radiologist detection performance, according to researchers in Japan. The work confirms an earlier, flawed study by the same group.


As reported earlier this year (Comput Med Imaging Graph 2005;29(1):35-41), researchers at Nagoya University School of Medicine verified again that monitor aging is one of the most common image quality issues. A flawed study design had hindered the earlier work (Invest Radiol 2003;38(1):57-63).

In the more recent study, the researchers performed four-alternative forced choice (4-AFC) experiments with 11 luminance conditions simulating CRT degradation.

Six radiologists with a mean of 17 years of experience and one pulmonary specialist performed a detection task covering 11,000 test areas on 110 test images. For each test area, subjects were asked to indicate either the quadrant most likely to have contained the target or that no target was seen.
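For illustration, here is a minimal sketch of how responses in such a forced-choice detection task could be scored against ground truth. The data, function name, and the "none" encoding for the no-target option are hypothetical and are not taken from the authors' analysis.

```python
import random

QUADRANTS = ("upper-left", "upper-right", "lower-left", "lower-right")
NO_TARGET = "none"  # observer reports that no target was seen

def score_responses(truth, responses):
    """Return the fraction of test areas answered correctly: a response is
    correct when it names the quadrant that contained the target, or 'none'
    when no target was present."""
    correct = sum(1 for t, r in zip(truth, responses) if t == r)
    return correct / len(truth)

# Hypothetical example: 10 test areas, roughly half containing a target
truth = [random.choice(QUADRANTS) if i % 2 == 0 else NO_TARGET for i in range(10)]
responses = list(truth)        # a perfect observer, for illustration
responses[3] = "upper-left"    # introduce one error (true answer was 'none')
print(f"Fraction correct: {score_responses(truth, responses):.2f}")
```

In the actual experiments this kind of score would be tallied separately for each of the 11 simulated luminance conditions so that detection performance can be compared across them.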

The study showed that when the maximum luminance of the CRT fell to 66.7% or less of the standard display luminance, the number of detected nodules decreased reliably. The corresponding threshold in the first study was 60.7%.

In all experiments, only the default standard image processing for chest image interpretation was used, and observers were not allowed to alter these preset image-processing functions during the reading sessions. Viewing time was unlimited, and ambient illumination was held at 200 lux.

Researchers recommend that luminance measurements be made periodically on every CRT monitor to ensure that maximum and minimum values remain within predetermined tolerances.
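As an illustration of that recommendation, the sketch below checks hypothetical maximum and minimum luminance readings against the 66.7% figure reported in the study. The standard maximum luminance and the minimum-luminance tolerance used here are assumptions for the example, not values from the paper or from any official QC protocol.

```python
# Minimal sketch of a periodic luminance QA check (assumed values, not an
# official QC protocol).

STANDARD_MAX_CDM2 = 300.0          # assumed standard maximum luminance (cd/m^2)
MIN_FRACTION_OF_STANDARD = 0.667   # threshold reported in the study
MIN_LUMINANCE_CDM2 = 1.0           # assumed upper tolerance for minimum luminance

def check_monitor(measured_max, measured_min):
    """Return a list of problems found for one CRT monitor's readings."""
    problems = []
    if measured_max < MIN_FRACTION_OF_STANDARD * STANDARD_MAX_CDM2:
        problems.append(
            f"max luminance {measured_max:.1f} cd/m^2 is below "
            f"{MIN_FRACTION_OF_STANDARD:.1%} of the standard condition"
        )
    if measured_min > MIN_LUMINANCE_CDM2:
        problems.append(
            f"min luminance {measured_min:.2f} cd/m^2 exceeds tolerance"
        )
    return problems

# Hypothetical readings from two monitors
for name, lmax, lmin in [("CRT-01", 290.0, 0.8), ("CRT-02", 180.0, 1.5)]:
    issues = check_monitor(lmax, lmin)
    print(f"{name}: {'OK' if not issues else '; '.join(issues)}")
```

In practice, a monitor flagged by such a check would be recalibrated or replaced so that its maximum luminance stays above the 66.7% level the researchers identified.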

"Adjustment of luminance on CRTs used for clinical image diagnoses should be made to maintain luminance so that maximum luminance does not fall below 66.7% of standard condition," said Dr. Mitsuru Ikeda, an associate professor of radiology at Nagoya.

In the earlier study, the observation order of display luminance always ran either from the brightest condition to the darkest or the reverse, so the results were biased, Ikeda said. The follow-up work therefore used an experimental design in which images of varying display luminance were presented in random order.

"Luminance changes in CRT displays were the same as in our previous study, in which grid voltage was decreased," Ikeda said.
