When referrers don’t use clinical decision support, radiologists don’t get reimbursed.
CHICAGO - The buzz about clinical decision support and the need for appropriateness guidelines isn’t new, but implementing and using such tools correctly will soon become even more critical.
According to industry experts at this year’s Radiological Society of North America annual meeting, if referring physicians don’t master clinical decision support (CDS) and use it consistently, it’s going to cost radiologists money.
As of Jan. 1, 2017, under the Protecting Access to Medicare Act, radiologists won’t be paid for outpatient, non-emergent services if their claims don’t include a number proving the referring physician consulted a CDS tool, said Ramin Khorasani, MD, vice chair of the Brigham and Women’s Hospital radiology department.
But it still isn’t clear how radiology can best teach other providers about diagnostic imaging appropriateness. A recent pilot initiative, the Medicare Imaging Demonstration (MID), showed some improvement in how referring physicians prescribed imaging, but many doctors and surgeons reported dissatisfaction with the CDS software.
“The Demonstration’s intent was to measure and improve the appropriateness of advanced imaging, but the implementation wasn’t an effective means to improve ordering,” said Katherine Kahn, MD, distinguished chair in Health Care Delivery Measurement and Evaluation at the RAND Corporation, which was responsible for evaluating MID. “There’s a slew of concern, from how much time was involved in using CDS to disagreement with the appropriateness rating system to disagreement with the guidelines.”
MID’s goal was to collect data from seven pilot sites on how physicians used diagnostic imaging services and whether they ordered studies based on appropriateness criteria established by specialty societies, including the American College of Radiology. The program, the largest CDS evaluation in the United States to date, ran from October 2011 to September 2013.
More than 5,100 providers participated, including primary care physicians, medical specialists, surgeons, nurse practitioners, physician assistants, and non-physician specialists. Overall, they ordered approximately 140,000 imaging studies.
For the first six months of the demonstration, CDS software gathered information without interacting with ordering physicians. For the last 18 months, providers received alerts for 12 advanced diagnostic imaging procedures across three modalities (MRI, CT, and nuclear medicine): MRI lumbar spine, CT lumbar spine, MRI brain, CT brain, CT sinus, CT thorax, CT abdomen, CT pelvis, CT abdomen and pelvis, MRI knee, MRI shoulder, and SPECT MPI.
During MID, 37% of primary care physicians ordered advanced imaging on five or more different body parts, and 23% ordered studies on only one body part. Nine percent of medical specialists and 3% of surgeons requested advanced imaging of five or more body parts.
“We need to think about lessons learned from this evaluation,” she said. “If CDS is going to impact clinician ordering behavior, we need to focus on the physicians placing orders across many different parts of the body.”
Through the first six months, MID results showed between 62% and 83% of referring physicians ordered studies in line with appropriateness guidelines. After they began using CDS software, the percentage rose to between 75% and 84%, said Kahn, who is also the associate division chief for research in the division of general internal medicine and health services research, department of medicine, at the University of California, Los Angeles.
Despite the improvement, referring physicians weren’t pleased with the software. Many complained that the tool offered no definitive guidance on up to 90% of the imaging studies they ordered. Others found the software’s design too overwhelming to be useful. Overall, on a scale of 1 to 9, medical specialists scored the CDS around 5 or 6 in helpfulness; surgeons scored it at 2.
The most consistent feedback from the pilot sites, she said, is that a CDS tool should be applied in a more targeted fashion, giving the referring physicians involved a greater opportunity to learn the software and apply it effectively.
“Despite a lot of dissatisfaction by clinicians, there is a suggestion that if we could do a better job of linking the way clinicians place orders with better delivery of the appropriateness guidelines,” Kahn said, “we might have a match here.”