Deconstructing the Root Cause Analysis in Radiology

December 4, 2013

CHICAGO - A root cause analysis can help prevent future mistakes, and isn’t just a way to place blame on clinicians. Here’s what radiologists need to know.

Any clinician can be called into a root cause analysis after something has gone wrong.

Radiologists may be called in if a patient codes during a procedure. Or perhaps you read an X-ray and the patient codes later in the day: you never laid eyes on the patient, but you may still be called into the analysis.

Root cause analyses (RCAs) are required by the Joint Commission in the wake of a sentinel event, and the commission provides clear guidelines for conducting them.

Sumir Patel, MD, chief resident, radiology, Medical College of Georgia at Georgia Regents University, explained at a session of RSNA 2013 that the intent is not to blame a particular person. “We’re looking at systems and processes, not an individual performance.” The ultimate goal is to decrease the likelihood that the error could occur again.

Identifying the root cause can expand learning beyond a particular event as other departments see trends and patterns relevant to their work.

Root causes in a hospital can be found anywhere in the system, from human error to equipment, education, and protocols.

Jim Rawson, MD, FACR, chair of radiology at Georgia Regents University, gave an example of a 4-year-old spilling milk on a rug. A root cause analysis would include questions such as: Who chose the cup? Who poured the milk? Who got yelled at? Was the system designed so that the milk would spill?

In a hospital, that translates to questions such as: Did short staffing contribute to the error? Did the clinician or staff member have the proper training for the procedure? Were protocols followed? Is there a checklist for the procedure? Does the staff need more cross-training? Do all staff members know they can stop a process at any step along the way?

Norman Thompson, MD, MBA, associate professor at Georgia Regents University, gave a hypothetical example of an error that occurred when a majority of radiologists in a particular practice were at a professional conference. In that case, could elective cases be scheduled on a different day?

Sometimes the equipment itself enables the user to make a mistake. “Too many pieces of equipment are designed that you can keep bypassing the dose limit and have an adverse event,” he said. “We need stops built into the hardware that make it difficult to make human errors because ultimately we are human. We are wired to make mistakes… Our systems, our policies, our processes have to provide a backstop for our patients.”

Staff can benefit from scenario planning, Thompson said, such as rehearsing what to do if the ultrasound fails. The benefit is not necessarily in anticipating a particular scenario but in getting the staff to think outside the box and stay agile when unexpected events occur.

Thompson said that while healthcare generally does a good job of getting to the bottom of how errors happen, it does not do as well as other industries at sharing that information broadly.

If an airline discovers it has a problem with an engine part, the information is distributed widely throughout the industry, from manufacturers to end users.

Not so in healthcare, he said, where there is currently no way to disseminate the information learned from an error across all institutions.