Radiology Practices Using Real Data in the Real World

October 23, 2014

Business analytics are useful for more than just a radiology practice’s financial performance.

Operating a successful radiology practice has always required more than just keen financial instincts. And as practices get larger and more sophisticated, the era of managing by gut is most decidedly over. Diagnostic Imaging asked three Radiology Business Management Association (RBMA) members to share ways they are using data in real world situations to improve practices’ performance.

One of the biggest things looking at data does is help you clearly define your current state, says Lisa Mead, RN, MS, a quality and leadership development consultant for Crowne Healthcare Advisors. Once you know exactly where you are, you can discuss that status in an objective, nonconfrontational way and work on solutions. Finally, once you implement changes, data analysis allows you to clearly define your goal and measure whether the changes helped you achieve it.

A simple example Mead has employed is pulling data from the practice’s phone system to address patient dissatisfaction from long hold times and problems getting through.

To solve this problem, the instinct is to simply put more people on the phones, but a better approach is looking at what the data are telling you, Mead says.

To help the practice identify the system problems at the root of its long hold times and high percentage of abandoned calls, Mead got reports on the number of calls, the number of calls by agent, the number of abandoned calls, and the amount of time people were in the queue before they hung up. The key indicator was the abandoned call rate, and the practice set a goal of getting this number under 2%.
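The key metric Mead describes is straightforward to compute from phone-system call logs. A minimal Python sketch, using an entirely hypothetical call-record structure (the article does not specify the phone system's export format), checks the abandoned-call rate against the practice's 2% goal:

```python
from dataclasses import dataclass

@dataclass
class Call:
    queue_seconds: float   # time the caller spent waiting in the queue
    abandoned: bool        # True if the caller hung up before reaching an agent

def abandoned_call_rate(calls):
    """Fraction of calls abandoned before reaching an agent."""
    if not calls:
        return 0.0
    return sum(1 for c in calls if c.abandoned) / len(calls)

def average_queue_time(calls):
    """Mean seconds spent in the queue across all calls."""
    if not calls:
        return 0.0
    return sum(c.queue_seconds for c in calls) / len(calls)

# Illustrative data: 3 abandoned calls out of 100 gives a 3% rate,
# which misses the practice's goal of staying under 2%.
calls = [Call(queue_seconds=30, abandoned=False)] * 97 + \
        [Call(queue_seconds=180, abandoned=True)] * 3
rate = abandoned_call_rate(calls)
meets_goal = rate < 0.02
```

The same per-agent breakdowns and queue-time averages Mead pulled would just be further aggregations over these records, grouped by agent or time window.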

Then Mead, the practice managers, and the staff worked together on solutions to achieve that hard statistical goal, examining every step in the process and also what other duties the staff taking calls were responsible for at the same time as they were answering the phones. Data dashboards helped keep the metrics transparent and visible to all.

“[Inefficiency] is typically no one’s fault, it is just how businesses are run,” Mead says. “We add on new processes and don’t really think through how that is going to work and how much time that is going to take.”

Pulling meaningful information from the hard data is key to creating the incentive for change, Mead adds. “It helps everyone realize there is an issue and it is an issue we can get our hands around with a clear goal, because if you just say ‘people are complaining they are on hold too long,’ no one can respond to that.”

From Calls to Quality
The principles of using data to clearly define a problem and then work on solutions in a nonconfrontational way also work in more complex situations, including reviewing imaging protocols to improve quality and safety. In January 2013, Michael Bohl, executive director of the Radiology Group, PC, SC, in Davenport, Iowa, began submitting data to the ACR Dose Index Registry to see how his practice compared to national averages on various types of imaging studies.

Bohl didn’t really expect to find anything, as he considered his practice’s technologists, radiologists, equipment, and protocols to be very good. And indeed, when he got his first report back after six months, most of the imaging studies his practice did were either in line with or better than national averages. However, for a few types of studies, doses in his practice were higher than the national averages.

“I was surprised at what we saw,” Bohl says. “Without something like the ACR Dose Registry, you don’t really know where to start.”

The data allowed Bohl to go to his technologists and radiologists and engage them around dose management in a different way than traditional top-down management.

“This allows me to go to them and say, ‘look, our CT chest doses are higher than the 75th percentile. Why is that? What is our protocol?’” Bohl says. “It is having them involved in looking at the protocol and thinking about what is driving the dose higher than it ought to be and then having them be able to suggest changes.”
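The comparison Bohl describes amounts to checking each exam type's dose statistics against a registry benchmark and flagging the outliers for protocol review. A minimal sketch, with entirely hypothetical dose values and benchmarks (the registry reports real percentiles per exam type):

```python
# Hypothetical median doses (CTDIvol, mGy) for this practice, per exam type.
practice_median_dose = {
    "ct_chest": 14.2,
    "ct_head": 55.0,
    "ct_abdomen_pelvis": 15.8,
}

# Hypothetical national 75th-percentile benchmarks from the registry.
registry_75th_percentile = {
    "ct_chest": 12.0,
    "ct_head": 60.0,
    "ct_abdomen_pelvis": 17.0,
}

# Exam types where the practice exceeds the benchmark become the
# protocol-review candidates to bring to technologists and radiologists.
outliers = {
    exam: dose
    for exam, dose in practice_median_dose.items()
    if dose > registry_75th_percentile[exam]
}
```

In this illustrative data only CT chest exceeds its benchmark, mirroring the conversation Bohl describes having with his staff.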

One solution the technologists came up with was to reduce dose in soft-tissue neck scans with contrast by delaying the start and decreasing the frequency of the monitoring scans.

“It is a real opportunity to engage staff that is a little bit different than their daily routine,” he says.

Data and Imaging Utilization
Financial, operational, and clinical uses of data will only grow as the ability to collect and analyze practice data improves. But as the amount of data practices can collect increases, it also becomes important to pay attention to which metrics truly matter and who owns that data, says Liz Quam, executive director of the CDI Quality Institute, the nonprofit organization directly affiliated with the Minneapolis, MN-based Center for Diagnostic Imaging (CDI), a multistate provider network for diagnostic imaging, interventional radiology, and mobile imaging services.

When the time came for CDI to upgrade and rebuild its data warehouse, the project prompted some tough discussions about how data will be used in the future, Quam says. One novel use of data CDI has pilot tested with some of the primary care groups it serves is sharing the appropriateness score on individual studies with the ordering physicians. To achieve this, contracts had to be carefully constructed so that the appropriateness scoring would remain confidential between CDI and the ordering physicians’ organization. “We certainly don’t want those scores to be used inappropriately, but giving them to the physicians themselves is probably very useful,” Quam says.

The practices in the pilot report that having the appropriateness score on the ordered studies allows them to go back and have their own objective data-based discussion within the group about individual physicians’ ordering practices.

In one recent case, it led the ordering physicians to review their ultrasound ordering because the ultrasounds so frequently led to follow-up studies that it was likely best to skip the ultrasound and go straight to the more detailed study.

Quam also notes that practices will need to address the requirement created by the latest sustainable growth rate (SGR) formula patch that all advanced imaging orders for Medicare patients go through some form of clinical decision support (CDS) by the end of 2017.

“All [radiology professionals] have to figure out how to do this,” she says. “Are we going to do it through an EMR system our partner hospital has? Are we going to offer it ourselves, such as CDI is doing through a physician portal, especially for those physicians who are not connected to the EMR? And once we have done that, what kind of data do we want to get out of it?”

One of the hardest things to look at with data is correlation of appropriateness scores to findings, Quam notes. “We can do that because we can do a word search of our reports and there isn’t anyone else who can do that, including the radiology benefit managers who are trying to figure out how to morph into the CDS space,” she says.
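The word-search correlation Quam describes can be sketched simply: pair each report's text with the order's appropriateness score, classify reports as positive or negative by keyword, and compare finding rates across score groups. Everything below is hypothetical (the record layout, the keyword list, and the scores, which follow the common 1–9 appropriateness scale); real report mining would need far more robust language handling:

```python
import re

# Hypothetical records: each report paired with the order's appropriateness score.
reports = [
    {"score": 8, "text": "Findings: acute appendicitis with periappendiceal fluid."},
    {"score": 2, "text": "Impression: no acute abnormality identified."},
    {"score": 3, "text": "Impression: unremarkable study."},
    {"score": 7, "text": "Findings: obstructing 6 mm ureteral calculus."},
]

# Simple keyword heuristic for a "negative" report.
NEGATIVE = re.compile(r"no acute|unremarkable|within normal limits", re.IGNORECASE)

def positive_finding(text):
    """Treat a report as positive unless it matches a negative phrase."""
    return NEGATIVE.search(text) is None

def positive_rate_by_group(records, threshold=4):
    """Positive-finding rate for low-scored vs adequately scored orders."""
    groups = {"low": [], "adequate": []}
    for r in records:
        key = "low" if r["score"] < threshold else "adequate"
        groups[key].append(positive_finding(r["text"]))
    return {k: (sum(v) / len(v) if v else None) for k, v in groups.items()}

rates = positive_rate_by_group(reports)
```

A gap between the two groups' positive-finding rates is the kind of correlation signal Quam says only a practice with full access to its own report text can extract.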

In addition, CDI has begun looking at how many times primary care physicians initiate a consult with the radiologist to get some guidance when an order with a low appropriateness score is placed. “That is of great interest to a couple of our primary care groups in the pilot phase that have risk sharing arrangements with payors,” Quam says.

Carefully thinking through what data you will need, who owns the data, and how it will be shared is a big challenge in future contracting, Quam predicts. “Ownership of data and access to data is going to be more and more an issue of importance to radiologists,” she says.

It is one of the many hard issues practices will confront along the road to realizing the promise of data in real-world situations.