Monitors need monitoring

The weakest link in the digital imaging chain is often the last: the display station itself. While standards such as the DICOM Grayscale Standard Display Function (GSDF) have emerged to calibrate monitor performance, the GSDF was not designed to address ambient lighting levels, maximum monitor luminance, luminance ratio, luminance uniformity, or spatial resolution.
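The GSDF itself is a closed-form curve, defined in DICOM PS3.14, that maps a just-noticeable-difference (JND) index j (1 to 1023) to a target luminance, so that equal steps in driving level produce perceptually equal steps in brightness. A minimal sketch of evaluating that curve, using the published polynomial coefficients:

```python
import math

# DICOM PS3.14 GSDF coefficients for log10(L) as a rational
# polynomial in ln(j), where j is the JND index (1..1023).
_A, _B, _C = -1.3011877, -2.5840191e-2, 8.0242636e-2
_D, _E, _F = -1.0320229e-1, 1.3646699e-1, 2.8745620e-2
_G, _H = -2.5468404e-2, -3.1978977e-3
_K, _M = 1.2992634e-4, 1.3635334e-3

def gsdf_luminance(j: float) -> float:
    """Target luminance in cd/m^2 for JND index j per the DICOM GSDF."""
    x = math.log(j)
    num = _A + _C * x + _E * x**2 + _G * x**3 + _M * x**4
    den = 1 + _B * x + _D * x**2 + _F * x**3 + _H * x**4 + _K * x**5
    return 10 ** (num / den)
```

Calibration software inverts this mapping: it measures a monitor's actual luminance response and builds a lookup table so that displayed gray levels land on the GSDF curve between the monitor's minimum and maximum luminance.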

Two recent British studies explore calibration considerations for viewing soft-copy images.

The first assessed the performance of primary clinical display monitors in use in a large acute National Health Service radiology department, using methods and guidelines described by the American Association of Physicists in Medicine (AAPM) Task Group 18 (Br J Radiol 2007;80:256-260).

"Nearly 30% (four of 14) of the display monitors failed to meet at least one of the test's guideline tolerances," said David Thompson, a medical physicist at the University Hospital of North Staffordshire in the U.K.

Two monitors had had their brightness deliberately reduced to prolong their life, while others were found to be operating at settings that could shorten their useful life span. The failing devices were either replaced or recalibrated by the installers, or adjusted locally to ensure the applicable standards were met, Thompson said.

Ambient light has been shown to be an important parameter in determining the ability of radiologists to detect low-contrast image details. Excessive background light levels and extraneous light sources can illuminate the display surface, raising its minimum luminance and reducing perceived contrast.

The Thompson study suggests that quality assurance testing of display monitors used for image reporting is necessary to ensure that images are viewed at an appropriate standard. Thompson said each monitor evaluation takes about 20 minutes to complete.

"Imaging departments with limited resources may find the task of initiating a suitable quality assurance program somewhat daunting, but consideration should be given to the results of this study," Thompson said.

The second study also looked at workstation calibration issues (Br J Radiol 2007;80:503-507).

"Specifications for devices used for display of soft-copy images are not currently well defined, nor are the requirements for optimal set-up and quality assurance," said David S. Brettle, Ph.D., chief of radiological physics at Leeds General Infirmary.

Brettle's paper evaluates the current situation and offers potential hospital-wide solutions. One solution he presented in an earlier paper is a psychophysical check to ensure optimal display performance before viewing software can be launched (Br J Radiol 2005;78:749-751). The check takes the form of a challenge-response test based on the threshold of contrast detection, which doubles as the portal through which the image viewing software is launched.
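The paper does not publish an implementation, but the idea can be illustrated with a hypothetical sketch: present a low-contrast patch in a random quadrant of a uniform field, and only unlock the viewer if the observer correctly locates it. Function names, the grid size, and the contrast step here are all illustrative assumptions, not Brettle's actual test.

```python
import random

def make_challenge(size=8, background=128, contrast=3, rng=None):
    """Build a size x size grayscale grid (8-bit values) with one
    low-contrast square patch in a random quadrant (0..3).
    Returns (grid, correct_quadrant). Parameters are illustrative."""
    rng = rng or random.Random()
    grid = [[background] * size for _ in range(size)]
    quadrant = rng.randrange(4)
    half = size // 2
    r0 = half if quadrant >= 2 else 0   # bottom half for quadrants 2, 3
    c0 = half if quadrant % 2 else 0    # right half for quadrants 1, 3
    for r in range(r0, r0 + half):
        for c in range(c0, c0 + half):
            grid[r][c] = background + contrast  # barely-visible patch
    return grid, quadrant

def gate_launch(observer_answer, correct_quadrant):
    """Unlock the (hypothetical) viewer launch only if the observer
    correctly located the low-contrast patch."""
    return observer_answer == correct_quadrant
```

If a monitor or its viewing environment has drifted so far that the patch is invisible, the observer fails the challenge and the workstation is flagged before any clinical images are read.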

© 2024 MJH Life Sciences

All rights reserved.