Photosonography, Art, and a Clinical Surmise
On the Practice of Ultrasound
I received an ad on a social medium from teespring.com. It offered a text T-shirt with this: “Doctor: noun. [dok-ter] someone who solves a problem you didn’t know you had in a way you don’t understand. See also wizard, magician.”
I like it. Finding and fixing an unknown problem is what early diagnosis is all about. This is both an ongoing and emerging area for ultrasound. The ‘understanding’ part has ever been the case in medicine. Getting around that hurdle, to be able to reconcile scientific knowledge with vast uncertainties while having to act, is what practitioners do every day. Doing this is one of the great achievements of man.
How do we decide what, when, where, and how we make the best use of ultrasound as our tools become more dependable, our own knowledge and experience increase, and new opportunities arise for sharing and integrating with other facets of health care? This is a grassroots issue pertinent to us all. Actually, these are the same issues that have confronted artists ever since France’s first painter in residence decided to daub his or her newest mineral pigments on stone slabs at the Grotte de Lascaux 20,000 years ago. As a medium for a technical journal, stone walls are hard to circulate and the message is monolithic, but, boy, do they last.
Artists’ works are creative acts. Art may be pressured by sociocultural forces; its execution involves intimate familiarity with the subject matter, the medium, and the tools at hand. I think that a lot of the problems that face ultrasound come from failing to identify the operator as a visual artist first and as a user of ‘scientific’ tools second. The ‘traditional’ artist seeks to reveal something about his subject, perhaps a mood or a memory, and selects the medium and technique to convey that notion. The ultrasound artist has one task, which is to represent pathology with utmost clarity.
The Tool of the Trade
All of the visual arts have commonalities. There is some display medium that is intended to be viewed. I am thinking of a human viewer, whether or not there are human or machine steps of acquisition or processing along the way. Transducers are our brushes, painting a slice or volume of tissue with sonic energy and then making the interaction of matter and dark light visible. The painter controls his or her work by selecting the surface, the pigments, and brushes, and follows a conceptual map. Our controls are in the selection of equipment and the patient cohorts we will see, in the operation of the equipment itself, and in the light of our own knowledge and experience.
Equipment choice is, unfortunately, not always in the hands of ultrasound imagers themselves. So a lot of our control comes down to equipment settings during an exam. I have never liked the term ‘knobology’, even though I admire the quaintly retro sound. We have to deal with patients of wildly different sizes and shapes and a range of detectable pathology affected by comorbidities and ultrasonic appearances affected by geometry and environmental factors. Knobs, buttons, and sliders are how we compensate as well as we can for those variations. Part of the learning process is understanding what every control does by itself and in combination with the others. Operators should strive for the ability to play the instrument to create the most harmonious sound during an exam without having to engage the slow thinking apparatus (see Behavioral Ultrasound). In short, you need to grok it.
All of these things are, or should be, obvious to the ultrasound professional and a part of traditional educational programs. But this has become a timely issue as more and more nonimaging specialists go from equipment purchase to direct clinical use by dipping into the community chest instead of going to jail first. Ultrasound has never grappled well with professional standards. For a developing, technically unstable field, that is both a boon and a detriment. Perhaps there is an alternate source of knowledge, one we already have, that can be an educational boost for ultrasound use, especially for beginners? Everyone does some kind of photography now. Let’s see if that framework will work.
Ultrasound and RADAR are identical twins living in different homes. Digital cameras and ultrasound are fraternal twins and kissing cousins. An ultrasound imaging system is a digital camera with a specialized source of illumination for seeing sonic swarms. If you know how to use a digital light camera, you know how to make ultrasonic images. Photography is an art form, whether it is a selfie done with a smart phone or deep space photographs through the Hubble telescope with a bajillion pixel one-of-a-kind DSLR and image processing on massively parallel computers.
I have a recollection that it takes eight or nine photons arriving simultaneously to trigger a response from a rod. The threshold for a photoresponse for a film or light sensor (like a CMOS array) is its ‘speed’, graded as its ISO number. The lower the speed, the better the image quality becomes. A very high ISO might let you take a picture in near darkness, but the final image will be noisy and have low contrast. The range of energy from response to overloading the sensor is its ‘latitude’. The aperture is standardized as f-stops, which are calibrated so that an up or down adjustment of an f-stop preserves the energy admitted to the camera by a compensatory change in exposure time. For example, with ISO 25 film on a bright sunny day, f11 at 1/100th sec is effectively the same as a 2-stop change to f5.6 with an exposure of 1/400th sec. Does this make any difference? Well, if you are at a beach and you want all the bulging pecs and skimpy bikinis in sharp focus, you would go with f11 or higher. But, if you wanted to make a Valentine’s message including a single, perfect rose in the garden during a stiff breeze, you would pick a low f-stop and a high shutter speed, freezing motion and blurring out the background. These are the kinds of operational decisions that every adept photographer makes, usually by visualizing the result that he or she wants.
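That reciprocity is simple arithmetic in stops: light admitted scales with the square of the f-number and linearly with exposure time. A minimal sketch of the bookkeeping (the function names here are mine, purely for illustration, not from any camera API):

```python
import math

def aperture_stops(f_from, f_to):
    """Stops of light gained by opening the aperture from f_from to f_to.
    Admitted light scales as 1 / (f-number squared)."""
    return 2 * math.log2(f_from / f_to)

def shutter_stops(t_from, t_to):
    """Stops of light gained by lengthening the exposure from t_from to t_to seconds."""
    return math.log2(t_to / t_from)

# f11 at 1/100th sec versus f5.6 at 1/400th sec
wider = aperture_stops(11, 5.6)        # about +2 stops from the wider opening
faster = shutter_stops(1 / 100, 1 / 400)  # exactly -2 stops from the shorter exposure
print(round(wider, 2), round(faster, 2), round(wider + faster, 2))
```

The net change is essentially zero; the tiny residual comes from nominal f-numbers like 5.6 and 11 being rounded versions of powers of the square root of 2.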
Light cameras take advantage of the photoelectric effect; ultrasound units, of piezoelectricity. Ultrasound phonons are way less energetic than light photons and propagate by exchanging vibrational energy in latticed structures. They’re glacially slow by comparison and get distracted easily, bouncing hither and yon as they transit composite biological materials. Scattering and absorption are different, but the steps along the image chain are the same.
Gain and Power Output
The basic ultrasound control is gain, overall or in depth-specific compartments. Transducers have a relatively low lens-equivalent aperture. Simple beam former units make a composite image from one to a few separate focal zone compartments. The newest dynamic beam former units (plane wave and quasi holographic units) can achieve broad focal zones through digital signal processing. Changing the gain is like increasing the ISO in a digital camera. The higher the gain, the higher the signal amplitude, the brighter the image, and the more noise fills in signal-free features. Early ultrasound machines are like moving a slit in front of a camera while flashing light hundreds or thousands of times a second. More modern systems have sonic sources that are more like strobe lights: short in duration, high in intensity, and very broad in bandwidth. The spectral features of the ultrasound signal are the equivalent of the color composition of the light. I’m not going to go into this further, because it is a differential factor between lines of equipment but should be consistent for identical models from the same manufacturer.
I tend to operate at the highest power output settings that the system will allow. There are regulatory limits imposed on manufacturers, which, as far as I can tell, are completely arbitrary. Maximum power, like bright sunlight, means better image quality (which mostly comes down to less noise, although nothing is ever simple). Gain (which operates on received signals) is minimized. Also, better images mean quicker exams with less overall exposure; this tactic is pro-ALARA. To be fair, ultrasound dosimetry is a kind of bog with few landmarks. Ultrasound pulses have peculiar waveforms which change as the pulses propagate. It is not at all clear what index is representative of actual energy transfer in tissue, nor is there a good way to determine actual energy transfers from low power diagnostic units under actual scanning conditions.
Some units have a control for ‘persistence’, which averages a few frames; others have this as a built-in feature. About four frames are best for most applications. This is a little like increasing the exposure time for the same aperture without overloading the sensor. When you superimpose frames with a steady hand, real echoes sum, while random noise won’t be in the same place in each frame. The signal-to-noise ratio is related to the square root of the number of samples, so averaging four frames doubles the signal-to-noise ratio relative to a single frame, and it won’t slow scanning speed perceptibly. Try this with a carotid in cross section sometime. With a lot more averaging, the brightness level goes up, contrast goes down, and moving targets blur. If you do a carotid, be sure that it fills as much of the image as possible. Viewing a very small screen or having the object of the exam take up a small part of the display matrix might not show speckle noise, but neither does it show fine detail.
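The square-root relationship is easy to demonstrate with simulated frames of pure noise: averaging four of them should cut the residual noise roughly in half. A toy sketch (the function and its parameters are my illustrative choices, not anything from a real scanner):

```python
import random
import statistics

def residual_noise_sd(n_frames, n_pixels=20000, noise_sd=1.0, seed=1):
    """Standard deviation of the noise remaining after averaging n_frames,
    measured over n_pixels independent pixels of zero-mean Gaussian noise."""
    rng = random.Random(seed)
    averaged_pixels = [
        sum(rng.gauss(0.0, noise_sd) for _ in range(n_frames)) / n_frames
        for _ in range(n_pixels)
    ]
    return statistics.stdev(averaged_pixels)

one_frame = residual_noise_sd(1)
four_frames = residual_noise_sd(4)
print(round(one_frame / four_frames, 1))  # close to 2.0, i.e., sqrt(4)
```

The same arithmetic also explains the diminishing returns the text warns about: going from 4 to 16 frames buys only another factor of 2 while quadrupling the motion blur window.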
I think the most important control (and the one least used and appreciated) is ‘Dynamic Range’, DR, or log signal compression. Remember exposure latitude? The range of echo amplitudes for most tissue fields is greater than display units can handle; logarithmic signal compression funnels the range of echo amplitudes down to the latitude that can be displayed (and perceived). Displays are vastly better for contrast ratio than they were even 10 years ago, but this is still a critical setting, and it works in conjunction with the look-up table that governs the gray shade or color hue that is assigned to small blocks of signal amplitude.
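The idea behind the DR control can be sketched as a toy mapping (the function name, the 256-level display, and the example windows are my illustrative assumptions, not any vendor’s implementation): echoes within the selected window below the strongest signal are spread across the gray scale, and everything further down clips to black.

```python
import math

def log_compress(amplitude, dyn_range_db=60.0, max_amplitude=1.0, levels=256):
    """Map an echo amplitude onto a display gray level in [0, levels - 1].
    Echoes more than dyn_range_db below the maximum clip to black."""
    if amplitude <= 0:
        return 0
    db_below_max = 20.0 * math.log10(amplitude / max_amplitude)  # zero or negative
    fraction = max(0.0, min(1.0, 1.0 + db_below_max / dyn_range_db))
    return round(fraction * (levels - 1))

faint_echo = 0.1  # an echo 20 dB below the strongest in the field
print(log_compress(faint_echo, dyn_range_db=60))  # 170: upper mid-gray, wide window
print(log_compress(faint_echo, dyn_range_db=30))  # 85: much darker in a narrow window
```

Narrowing the window assigns fewer dB to each gray shade, which is exactly the contrast boost the text argues for; widening it crowds more echo amplitudes into the same shades and flattens the image.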
Some mid-1970s pioneers of gray scale imaging made two egregious errors. The first was to display images black/gray on a white background. Think about it: can you see stars during the day? The other was to assume that organs are uniform solids and to have the esthetic notion that they should have a uniform, smooth texture. To get that result they operated with a lot of signal compression and were proud that the units of the time were able to display 70 or 80 dB of signal instead of a measly 40 or 50. That’s wrong. The display dynamic range wasn’t much different, but the extra signal compression brought up the noise and made it as bright as real tissue, filling in voids. Perceptible contrast went to nil. A specific consequence was plummeting diagnostic performance for detecting (or excluding) primary and secondary malignancies in the liver; they didn’t look different from the background.
The original idea, to set a very high signal compression and leave it alone afterwards, is an artistic choice, but a bad one for ultrasound. There is a comparable issue in photography, which can be addressed by weighted averaging of multiple registered images of the same scene taken with different exposure factors, referred to as ‘HDR’ processing. For now, realize that contrast is the most important issue for visualizing focal lesions and regional differences in the collagen composition of tissues. Adjust and readjust the DR for optimum visibility of what you want to demonstrate while you are scanning. Seek the settings that look best when you start scanning, before you look at the content of the images.
Going back to the 1990s, a lot of manufacturers realized that there were users who didn’t like to adjust their devices for each patient and each kind of study. I remember visiting facilities with controls taped over, justified as “those settings work” (and could never be found again). A few companies had one-step settings based on the distribution of signal amplitudes in the image to use as starting points for an exam, leaving all the controls operative. A much more popular approach was to have a button that had settings that were said to be best for some applications, like the thyroid or the liver, and sometimes there were additional factors like obesity or trimester of pregnancy. Those presets were just combinations of the available controls, which is pretty insulting to someone who is an ultrasound professional. It is also a flawed notion, because pathology is the fixed quantity that we want to find, organs are the background variable. That counts for parenchymal disease, even if the metaphor is limited.
Digital imaging has changed everything. Ultrasound devices have, or should have, basic controls that are optimized during an exam; there is a second level of controls, like the way that gray shades are mapped to signal strength, that tends not to need a lot of adjustment. And then there are a vast number of settings that are hidden from users completely. These are the software controls for beam formation, image synthesis, and image processing that are interactive, in which some settings are unstable and will cause system crashes or result in lack of image fidelity, and where changing some options may require going through separate approval processes from regulatory agencies. This is a place where different regimes of processing can be selected by the operator as presets, i.e., before imaging occurs. It’s really a pity that the same word is used. The presets in dynamic beam former systems are labeled by their intent, like harmonics, compounding, max rez, or max depth, not as anatomic regions. This is actually the main area of ultrasound equipment development, and it is also the umbrella over transducers, display options, and all the other hardware components.
I take a lot of photographs. I’m very attached to my digital camera and to my lenses. There are lots and lots of image processing options built into the camera, but I rarely use them. Instead, I try to compose the image and get the exposure factors in the range I want. I save all of the info in RAW format, and then I process every picture I select afterwards. I do all the initial work in Lightroom and then do subtle touches in Photoshop. If you visit my website you will see that I prefer zero grain, clear, realistic images. I practically never render images or otherwise distort them. Dynamic beam former systems are software driven, necessarily digital.
When we have a device that stores all the raw data for the display matrix, then we have all of the options for digital image processing that are available to anyone, anywhere. We tend to do our exams, make images, and then abandon them, as is, afterwards, with exceptions like displaying images on ward rounds on a workstation, where you can change video brightness and contrast, do some rotations, or other simple transformations and annotate images. When I look at a lot of images at courses and in publications, I often have the feeling that a little postprocessing would make all the difference in the world didactically, especially for people who don’t like to imagine seeing lesions in a swirling fog. That was then, not now.
This is a low noise, dynamic beam former image (Figure 1). You have the sense that you are looking at tissue features directly. The dynamic range is distributed from jet black to bright white. The diagnosis should also be apparent. The discrete patches and psammoma bodies imply episodes of inflammation and healing. The inference is that this asymptomatic lesion has limited growth potential and may be monitored. Consider that the primary X-ray sign of in situ breast cancer is the finding of microcalcifications. These represent one of the body’s responses aimed at containing a cancer. Perhaps this model provides an explanation of why a lot of the breast cancers that mammography detects, and which are real, are not clinically aggressive.
The images are not of exactly the same location or magnification. There is no difficulty in seeing the exophytic nodule or appreciating the texture of completely normal thyroid parenchyma (ie, no gland suppression). This lesion is smaller and ‘newer’ than the one in Figure 1. Figure 2 panels are with different transducers, frequency spectra, and presets. Is there a difference, does it matter, is it significant? The opinion depends on the eye of the beholder, and that is what this article has been all about.
Ultrasound imaging is not fun, and it is not a hobby. You cannot divorce the image from the patient under the probe, the equipment that acquired the data, and the person who views it, either for quality control or interpretation. These things taken together define ultrasound as an art.