Sins of Omission by Ordering Clinicians

January 13, 2012

“Clinical correlation is recommended.” Them’s fighting words, in the right environments. Some clinicians react about as warmly to this phrase as they would to an extended middle digit or an unflattering maternal reference.

You can’t entirely blame them. A few of our colleagues are awfully liberal with the phrase, using it routinely - almost involuntarily, as if it were some kind of nervous tic or a ritualistic prayer to the Gods of Radiology.

In our defense, we’re routinely given pitifully little clinical background on the studies sent by those clinicians. Some of them are pretty blatant, and can come across as rather like a verbal flipping-of-the-bird themselves: “R/O pathology” leaps to mind, or the almost-equally unhelpful “Pain.” Sometimes the pain turns out not to be in the body part requisitioned for imaging. And, as a rare treat, “R/O pain.” I really, really hope none of my clinicians have ever meant that literally.

As bad as these gems are, I’ve grown almost fond of them. At least they’re up-front and honest about withholding the real clinical info from us. Far worse are the histories which pretend to be relevant, but really aren’t - lulling us into a false sense of security, letting us think we know why the exam was ordered, so only after reading the studies (and sometimes not even then) do we find out the real deal. I’ve actually heard clinicians say that they “didn’t want to bias” the radiologists by telling them the full tale.

“Chest pain; R/O PE,” for instance, turns out to be a patient with pneumonia (seen on prior imaging at another facility, whose images and report are of course not provided or even mentioned to us), whose fever and white count have spiked to new heights, leaving the clinician wondering what’s going on. No risk factors for thrombosis, of course. Asked why the extremely-relevant history was left out, the clinician admits a fear that the study would otherwise have been performed as a routine CT, rather than a CTA.

“Flank pain.” No mention of which side; those extra strokes of the pen (or keyboard) would be too much to ask of the beleaguered clinical team. Okay, well the study’s noncontrast, so at least we know it’s probably an evaluation for stones. Imagine the surprise when we learn that, no, the patient actually fell down a flight of stairs, and there’s pain in multiple locations. Still, the flank was the most tender, and they thought we’d like to know that. Oh, and the study was done noncontrast in the name of ER “throughput.”

Lest I unfairly tar too many of our clinical brethren with a single brush, some of them are equally frustrated by these omitted factoids which impair our contribution to patient care. Not uncommonly, I’ll hear them expressing disbelief that they wrote out a small paragraph of clinical info when requesting studies, and the final product’s clinical history got reduced to a word or two.

Why? Well, sometimes their nurses or clerical staff did a little simplification and/or editorializing when asked to fill out the requisition-slips. Sometimes the computer system didn’t permit the entry of a more lengthy clinical history.

Be on the lookout for more of the latter - sooner or later, “inappropriate reason for exam” will be blocking their computer-entered orders. And just to get their needed imaging done, clinicians will likely resort to telling the computers whatever’s necessary to get the orders accepted.