
The Complex Systems of Radiology and the False Simplicity of Working in Them


For system errors and failures in radiology, are we prone to a satisfaction of search that prevents us from addressing deeper issues?

I recently stumbled across “complex systems,” an entire intellectual field I had never heard of before. The idea itself isn’t new, but I had no idea there were academic programs dedicated to it, a body of literature, etc. There are even dedicated journals (including one that is unsurprisingly called Complex Systems).

My introduction to the subject was a 1998 writeup, How Complex Systems Fail, by Richard I. Cook, M.D., out of the University of Chicago. Dr. Cook focused a chunk of his career on patient safety. Health care is, after all, a great example of a complex system, and one can imagine patients being at risk when the system fails.

Descriptions of complex systems often reference Aristotle, whether or not they credit him: complexity makes systems greater than the sum of their parts. There's near-limitless potential for errors (failures, accidents, and catastrophes are some other terms that get used) in the interplay of the system's numerous pieces. Practically speaking, errors are commonplace and unavoidable.

The truly bad stuff happens when multiple errors intersect, something a complex system guards against by building in redundancies that cover for one another. Imagine a safety net that's full of holes; layer several on top of each other, and they cover one another's flaws.

Accordingly, if some hardware breaks, other hardware engages to trigger a warning light or sound a beep. Perhaps software kicks in to prevent usage of the broken item, or personnel, with proper training, routines, and checklists, see something is amiss and don't proceed as usual. But if all of these things fail in conjunction … well, you have probably heard of a few avoidable disasters in MRI suites.
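For readers who like seeing the arithmetic behind this "layered safety net" intuition, here is a minimal sketch (mine, not Dr. Cook's, and not from his paper; the 90 percent per-layer catch rate and the layer counts are purely hypothetical). It simulates errors passing through a stack of independent defenses, where an error only becomes a disaster if every layer misses it:

```python
import random

# Hypothetical numbers for illustration only: each safety layer independently
# catches a given error with probability CATCH_PROB. An error becomes a
# "disaster" only when every layer misses it, so with independent layers the
# disaster rate falls roughly as (1 - CATCH_PROB) ** n_layers.

CATCH_PROB = 0.9     # assumed per-layer catch rate (hypothetical)
N_ERRORS = 100_000   # number of simulated errors

def disaster_rate(n_layers: int) -> float:
    disasters = 0
    for _ in range(N_ERRORS):
        # random.random() > CATCH_PROB means this layer missed the error
        if all(random.random() > CATCH_PROB for _ in range(n_layers)):
            disasters += 1  # every layer missed; the error got through
    return disasters / N_ERRORS

for n in (1, 2, 3, 4):
    print(f"{n} layer(s): ~{disaster_rate(n):.4%} of errors become disasters")
```

The comfortable exponential drop-off holds only as long as the layers fail independently. In the real world they often don't: a staffing shortage, a budget cut, or a rushed shift can punch holes through several layers at once, which is exactly when the intersecting failures described above get their chance.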

Couldn't it all be made to work properly? Buy the best of everything, hire the best of everyone, and constantly police it all to fix anything that degrades from 100 to 99 percent? Practically speaking? No.

Making things worse, the system isn't a static entity. Equipment is constantly being used, broken, fixed, replaced, or upgraded. Humans in the system come and go, (mis)learn new things, forget old ones, etc. There is perpetual change in the mixture of potential failures. Many parts of the system are operating with flaws or are not entirely compatible to begin with. Even if you make everything perfect today, this probably won't be the case tomorrow.

Ultimately, humans in the system are expected to catch whatever falls through the cracks. Dr. Cook makes a salient point: "Human operators have dual roles: as producers (and) as defenders against failure." In other words, since we're talking about health care, a doctor isn't just there to function as a physician, but also to make sure his or her corner of the health-care system doesn't break down.

Let’s put this in radiological terms. You can't just sit and read studies, do procedures, attend tumor board and whatever committees you serve on, etc. Like it or not, your function is also to be a perpetual beta-tester for the flawed PACS/RIS you had no role in choosing, no ability to fix, and no authority to replace.

You have to ride herd on whatever ancillary staff didn't get proper training or aren't paying attention to what they're doing, even if you can't personally discipline or fire them. You need to chase down referrers who might not properly read your reports on their patients, and somehow make sure the patients do what they're supposed to (like having follow-up studies).

Do you think some of this stuff shouldn't be your burden? Others will disagree, and not just hungry malpractice lawyers looking for a payday. One of Cook's points is that a lot of the postmortem analysis of failures suffers from overly simplistic thinking: a notion that there was one "root cause" rather than a combination of failures intrinsic to the system. There is this idea that someone has to be held accountable.

Those blame assigners aren't the only ones prone to oversimplification. Most of us cogs in the machine are just as predisposed. From our perspective, every time we catch an error and prevent it from turning into something bigger, we confidently point at the single thing that we know made that error occur.

If only someone would listen to us about more accurate voice recognition, better workflow protocols, etc., we insist, everything would dramatically improve. Instead, here we are, trying to be catchers in the rye, saving everybody from the horrible things the system keeps trying to do to them, and trying to keep our own professional image from being degraded by the suboptimal conditions in which we work.

You don't have to be a part of the system to have that perspective. Anybody who's ever been a dissatisfied customer knows the feeling. Personally vexed and emotionally involved in the situation, you have absolute certainty that the employee who handled your case should have done this or that differently. The website or touchtone phone-maze of a particular business should have been set up some other way. You might be completely right -- your personal issue might have been avoided -- but you can't see how your "obvious" fix would fit into (or otherwise disrupt) the complex system of that business.

Remove yourself from the situation sufficiently, and it becomes easier to see the big picture, the proverbial "10,000-foot view." If you haven't traveled by plane recently, for instance, you might contemplate the airline industry and recognize that it could never operate with flawless efficiency. The best that can be hoped for is the absence of major disasters. Good luck having that equanimity about health care while you're working in the field (or, God forbid, experiencing it as a patient).
