New radiology workstation design can be tested and validated using inexperienced laypersons and look-alike radiological tasks, according to researchers at Simon Fraser University.
Evaluation of the design for a new radiology workstation typically requires a user study involving several radiologists who perform diagnoses with the new hardware and software.
According to a paper in the June issue of the Journal of Digital Imaging (J Digit Imaging 2005;18(2):109-115), however, it is possible to create tasks that mimic how radiologists commonly use a workstation. Researchers can then test new design features by observing nonradiologists completing these radiology-like tasks on stripped-down workstations, rather than conducting costlier user studies involving radiologists.
"Very little previous research has been performed on radiology workstation design testing, yet proper workstation design can lead to significant benefits in the efficacy of the radiologist," said M. Stella Atkins, Ph.D., a professor of computing science at Fraser.
Atkins and colleagues evaluated two different workstation interaction techniques with both laypersons and radiologists using a set of artificial targets to simulate the reading of a diagnostic examination.
Atkins first observed radiologists at work and then designed a set of stimuli that allowed her to simulate interpretation workflow with a typical task: identifying anatomic abnormalities in a projection radiography chest reading scenario.
"Looking at mouse clicks, response times, and interpretation errors, we found both groups were very similar. All the subjects mastered the style of interaction in a similar way," she said.
The next step is to extend these tests to include eye tracking, so that researchers can monitor and record radiologists' fixations and determine where they are looking while performing tasks. This type of study would help determine causes of errors and establish how much time is spent viewing the workstation controls, Atkins said.
Atkins also plans to evaluate other measures for optimizing the user interface, exploring how to introduce new tools to avoid visual distraction.