
Researchers turn to amateurs for workstation design evaluation


New radiology workstation design can be tested and validated using inexperienced laypersons and look-alike radiological tasks, according to researchers at Simon Fraser University.


Evaluation of the design for a new radiology workstation typically requires a user study involving several radiologists who perform diagnoses with the new hardware and software.

It is possible to create tasks that mimic how radiologists commonly use a workstation, however, according to a paper in the June issue of the Journal of Digital Imaging (J Digit Imaging 2005;18(2):109-115). Researchers can then test new design features by observing nonradiologists completing the radiology-like tasks on stripped-down workstations, rather than conducting more costly user studies involving radiologists.

"Very little previous research has been performed on radiology workstation design testing, yet proper workstation design can lead to significant benefits in the efficacy of the radiologist," said M. Stella Atkins, Ph.D., a professor of computing science at Simon Fraser.

Atkins and colleagues evaluated two different workstation interaction techniques with both laypersons and radiologists using a set of artificial targets to simulate the reading of a diagnostic examination.

Atkins first observed radiologists at work and then designed a task and a set of stimuli that allowed her to simulate interpretation workflow using a typical task: identifying anatomic abnormalities in a projection radiology chest reading scenario.

"Looking at mouse clicks, response times, and interpretation errors, we found both groups were very similar. All the subjects mastered the style of interaction in a similar way," she said.

The next step is to extend these tests to include eye-tracking, so that researchers can monitor and record the fixations of radiologists to determine where they are looking while they're performing tasks. This type of study would help determine causes of errors and establish how much time is spent viewing the workstation controls, Atkins said.

Atkins also plans to evaluate other measures for optimizing the user interface, exploring how to introduce new tools to avoid visual distraction.
