Widening the Scope of Root Cause Analysis

March 13, 2014

Root cause analysis isn’t reaching its potential in the radiology community.

Currently, root cause analysis is standard practice only after a sentinel event. But some professionals argue that limiting the practice to sentinel events results in missed opportunities for improvement.

The Joint Commission works with accredited organizations to thoroughly review all factors that may have played a part in the event, from human error to communication gaps to mechanical failures. While such an analysis is unquestionably important for events causing death or serious injury, radiologists say, there is also a need for analysis across a much broader scope of errors and inefficiencies.


“If only serious adverse events are analyzed, there will be very few opportunities for learning and improving,” Jonathan B. Kruskal, MD, PhD, professor of radiology at Harvard Medical School and chairman of the department of radiology at Beth Israel Deaconess Medical Center in Boston, told Diagnostic Imaging.

“Too often I attend [morbidity and mortality] meetings where adverse events are described, or cases shown, and the discussion simply moves to the next case with no consideration of what went wrong, how to fix it, how to prevent it from happening again, and how to share the lessons learned,” he said.

Radiologists can also learn from near misses, which are often harder to identify and frequently go unreported, Kruskal said.

Training needed

The challenge is that radiologists are not well-trained to do root cause analyses (RCAs) and therefore do them ineffectively or not at all, Kruskal said. While radiologists have become good at diagnosing why the PACS didn’t work, for instance, being open to analyzing clinical errors has been a tougher sell. “This is why we are trying to train more of our colleagues.”

The educational goal is to share different methods for conducting an RCA; show how contributing factors are categorized and then considered; explain why the focus should be on process, not just the human factors that led to an adverse outcome; and demonstrate how mitigating strategies can be identified and implemented.

The biggest task is orchestrating a cultural shift so physicians want to do these analyses, because they know the outcome will help them do something better.

“Instead of hiding our mistakes, the ideal culture encourages self-reporting such that cases can be pooled and analyzed to identify improvement opportunities,” Kruskal said.

One way to institutionalize RCAs, increasing awareness and setting the expectation that clinicians conduct them, would be to alternate RCA meetings with morbidity and mortality meetings from month to month, said Sumir Patel, MD, chief resident in radiology at the Medical College of Georgia at Georgia Regents University in Augusta.

At these meetings, it’s important to involve all levels of employees, not just high-level executives. “You want to be able to get the information from the boots on the ground,” Patel said.

Radiologists can use RCA to solve day-to-day problems such as workflow issues, Patel said.

Suppose you have a worklist and see certain studies that your radiology technologist has not yet completed, even though you can see the images on the screen. Clinicians are calling for results, but you don’t know whether the examinations have been completed.

That situation would benefit from an RCA to examine and streamline the current process so everyone knows when complete results are available.

Another example: clinicians refer patients to you for interventional procedures but don’t know the best way to contact you. You can use RCA to look at the issues complicating the referral stream, Patel said.

Beyond brainstorming

These issues may seem minor, but examining all of the factors at play, categorizing the errors, correcting the errors or inefficiencies, and disseminating that information to an entire organization takes a structured analysis.

RCAs go beyond routine examinations of errors or inefficiencies. They are formalized processes with follow-up structures, checklists, and plans of action. The Joint Commission template, for instance, asks clinicians to examine 24 questions, including what controllable environmental factors were at play; how equipment performance affected the outcome; what other areas of the organization could encounter the same problem; and how actual staffing compared with ideal levels.

“If you’re just brainstorming…you might be limited in scope. You might be satisfied with the one example that you found,” Patel said.

RCAs can also turn up latent errors you didn’t know existed. Addressing those errors proactively can help avoid sentinel events down the road, he said.

Sometimes radiologists aren’t part of hospital RCAs – either because they don’t make it a priority or because they aren’t included by others – and they need to be key players, Patel said.

“As radiologists have a seat at the table, they’re not going to be the dinner on the table,” he said.