Work list management tools are finding their way into PACS offerings, but that hasn't stopped many radiology departments from launching efforts to develop their own solutions, according to studies presented at the RSNA meeting.
More and more institutions are moving their work lists away from a first-in, first-out model to a system that red-flags high-priority studies. This way, busy departments can zero in on the most urgent cases.
Radiologists at Cincinnati Children's Hospital Medical Center, for example, launched RadStream last summer. This Web-based, homegrown automated triage system ensures priority for the most acute cases and immediate communication of urgent results to referring physicians. Its use has improved reading times for inpatient and outpatient exams and reduced final sign-off times for emergency department exams, according to Dr. Mark Halsted, Cincinnati Children's chief of radiology informatics.
RadStream also provides documentation and keeps a permanent log of all communications regarding cases, including the time, date, and staff involved in each communication event. An in-house survey revealed that the hospital's radiologists like the new tool. After implementation, radiologists reported a 22% decrease in daily interruptions, which translates into nearly 1500 radiologist-hours saved per year, Halsted said.
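A minimal sketch of the kind of communication-log record RadStream appears to keep, capturing the time, date, and staff involved in each event; the class and field names here are hypothetical illustrations, not taken from the actual system.

```python
# Hypothetical sketch of a communication-log record like the one RadStream keeps;
# class and field names are illustrative, not from the actual system.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CommunicationEvent:
    """One documented contact about a case: who spoke to whom, about what, and when."""
    accession_number: str          # exam identifier
    radiologist: str               # staff member reporting the finding
    referring_physician: str       # clinician who was contacted
    channel: str                   # e.g. "phone", "page", "portal"
    note: str = ""                 # free-text summary of what was communicated
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Appending each event to a permanent log preserves the time, date, and staff
# involved in every communication, as the article describes.
communication_log = []
communication_log.append(
    CommunicationEvent("ACC-0001", "Dr. A", "Dr. B", "phone",
                       "Urgent finding relayed; management discussed.")
)
```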
"We need to help busy radiologists prioritize their tasks," said Dr. Matthew Morgan, a radiology resident at the University of Pittsburgh Medical Center.
Morgan and colleagues developed an in-house software algorithm that highlights time-critical exams and automatically prioritizes a radiologist's work list. When developing the system, researchers defined two types of urgent exams: those from the ER, and those read by radiology residents or by clinicians providing preliminary reads.
Morgan's system is different from traditional filtering methods, which look first at division and study time as parameters. The logic built into the software prioritizes exams by patient location as well as STAT and urgent notifications by physicians. Use of the system in the neuroradiology division has decreased the time from exam completion to final interpretation from 9.6 hours to 7.8 hours for urgent studies.
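A rough sketch of how such location- and urgency-aware ordering might look in code, assuming a simple ranking rule; the field names, location ranks, and sample data are illustrative assumptions, not Morgan's actual algorithm.

```python
# Illustrative sketch of location- and urgency-aware work list ordering,
# in the spirit of the system Morgan describes. The ranks and field names
# are assumptions for illustration, not the actual algorithm.
from datetime import datetime

def priority_key(exam: dict) -> tuple:
    """Lower tuples sort first: STAT flags, then patient location, then age of exam."""
    stat_rank = 0 if exam.get("stat_flag") else 1
    location_rank = {"ER": 0, "ICU": 1, "inpatient": 2}.get(exam.get("location"), 3)
    return (stat_rank, location_rank, exam["completed_at"])

worklist = [
    {"accession": "A2", "location": "outpatient", "stat_flag": False,
     "completed_at": datetime(2007, 11, 26, 8, 5)},
    {"accession": "A1", "location": "ER", "stat_flag": True,
     "completed_at": datetime(2007, 11, 26, 9, 30)},
]
for exam in sorted(worklist, key=priority_key):
    print(exam["accession"])   # A1 (STAT exam from the ER) comes up before A2
```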
Addressing the need to streamline communication with referring physicians, Morgan's group developed four urgency levels:
Category 1. Findings are life-threatening, need high levels of attention, and require immediate synchronous communication. Minutes count in patient management.
Category 2. Findings need high levels of attention but can be managed in minutes to hours. They require reliable asynchronous communication such as e-mail.
Category 3. Findings can be managed by the end of the day. They can be communicated via synchronous, interruptive modes of communication.
Category 4. Findings do not warrant high levels of attention. Communication could be conducted through a radiology Web portal for physicians.
The software provides context-specific messaging for each type of communication.
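One way to picture that context-specific routing is a lookup from urgency category to communication mode, mirroring the four levels above; the channel names and message template below are hypothetical, not the group's software.

```python
# Minimal sketch of routing a finding to a communication mode by urgency
# category. Channel names and the message template are hypothetical.
CHANNEL_BY_CATEGORY = {
    1: "synchronous",            # phone call or page; minutes count
    2: "asynchronous",           # reliable e-mail with acknowledgment
    3: "synchronous-deferred",   # contact before the end of the day
    4: "web-portal",             # posted to a radiology portal for physicians
}

def compose_notification(category: int, accession: str) -> str:
    """Return a message appropriate to the urgency level."""
    channel = CHANNEL_BY_CATEGORY[category]
    return f"[Category {category} via {channel}] Findings notice for exam {accession}."

print(compose_notification(1, "ACC-0042"))
```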
"One-size communication does not fit all," Morgan said.
Dr. Dan Cohen, a radiology resident at Massachusetts General Hospital, and colleagues developed a system to notify requesting physicians of urgent findings. The alert is triggered by the radiologist and sent via e-mail to the referring physician. It does not contain the actual exam results but instead provides a link to them. To avoid desensitizing physicians to the alerts, they are not used for minor findings, Cohen said.
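A hedged sketch of such an alert: the message carries only a link back to the report, never the findings themselves. The addresses, URL, and SMTP host are placeholders, not MGH's actual configuration.

```python
# Sketch of an alert like the one Cohen describes: the e-mail contains only a
# link to the results. Addresses, the URL, and the SMTP host are placeholders.
from email.message import EmailMessage

def build_alert(referrer_email: str, accession: str) -> EmailMessage:
    msg = EmailMessage()
    msg["Subject"] = f"Urgent radiology finding: exam {accession}"
    msg["From"] = "radiology-alerts@example.org"
    msg["To"] = referrer_email
    # Only a link is included; the report text stays in the RIS/PACS.
    msg.set_content(
        "An urgent finding has been flagged on an exam you ordered.\n"
        f"View the report here: https://results.example.org/exams/{accession}\n"
    )
    return msg

# Sending would look like this (the SMTP host is a placeholder):
# import smtplib
# with smtplib.SMTP("smtp.example.org") as server:
#     server.send_message(build_alert("referrer@example.org", "ACC-0042"))
```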
During an eight-month period, radiologists at MGH completed more than 390,000 exams and sent 8210 alerts. About 70% of these were viewed by physicians, but more than 20% of the alerts did not generate return receipts, indicating that physicians may not have opened or received them.
A survey revealed several reasons why physicians missed the alerts:
- They had already seen the important finding in the HIS.
- They had never ordered a particular study.
- The patient was no longer their patient.
- Their systems were outside the hospital's firewall.
These missed alerts could be reduced by hiring dedicated staff to follow up and make sure that physicians receive and respond to them, Cohen said.
Dr. Kevin McEnery, an associate professor of radiology at the University of Texas M.D. Anderson Cancer Center, continued the theme of what he called smart radiology.
"The radiology workflow needs to be aligned with clinical workflow," McEnery said.
He outlined how his institution moved away from the typical first-in, first-out mode of interpretation by developing a work list engine that gathers data from the PACS, the RIS, the EMR, and the HIS. The engine allows researchers to manage the workflow of close to 1500 cases. Radiologists in the department have expressed satisfaction with the workflow, McEnery said.
That satisfaction is due in part to eliminating repeated interruptions caused by ad hoc prioritization, according to McEnery. Inpatient studies and STAT cases receive higher priority.
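A simplified sketch of what merging several feeds into one prioritized work list could look like; the feed functions, fields, and ranking rule below are assumptions for illustration, not M.D. Anderson's engine.

```python
# Illustrative sketch of a work list engine that merges feeds from several
# systems and ranks STAT and inpatient work ahead of routine cases. The feed
# functions and fields are hypothetical.
def fetch_pacs_exams():
    # Placeholder feed; a real engine would pull from the PACS via DICOM/HL7.
    return [{"accession": "P1", "stat": False, "patient_class": "outpatient",
             "completed_at": 2}]

def fetch_ris_orders():
    # Placeholder feed for RIS data; EMR and HIS feeds would look similar.
    return [{"accession": "R1", "stat": True, "patient_class": "inpatient",
             "completed_at": 5}]

def build_worklist(*feeds):
    exams = [exam for feed in feeds for exam in feed]
    # STAT first, then inpatients, then everything else in arrival order.
    return sorted(
        exams,
        key=lambda e: (not e.get("stat"),
                       e.get("patient_class") != "inpatient",
                       e.get("completed_at")),
    )

worklist = build_worklist(fetch_pacs_exams(), fetch_ris_orders())
print([e["accession"] for e in worklist])   # ['R1', 'P1'] - STAT inpatient first
```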
These homegrown approaches reflect a dearth of work list solutions from the commercial community, said Dr. Paul Chang, chief of radiology informatics at the University of Pittsburgh, during a panel discussion. But expecting vendors to have a product that pleases everyone may be unrealistic, given the various idiosyncrasies of users.
One solution would be for vendors to provide service-oriented architectures, which would allow departments to customize products to fit individual needs, said Dr. Keith Dreyer, vice chair of radiology computing and information sciences at MGH.
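As a loose illustration of that idea, a vendor might expose generic work list services while the department plugs in its own ordering rule; the interface below is entirely hypothetical, not any vendor's API.

```python
# Hypothetical sketch of a service-oriented split: the vendor exposes generic
# work list operations, and the department supplies a site-specific rule.
from typing import Callable, Protocol

class WorklistService(Protocol):
    """Vendor-provided services a department could build on."""
    def pending_exams(self) -> list: ...
    def assign(self, accession: str, radiologist: str) -> None: ...

def departmental_ordering(exams: list, rule: Callable[[dict], tuple]) -> list:
    """Local customization layer: sort the vendor's feed by a site-specific rule."""
    return sorted(exams, key=rule)
```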