In Radiology, Man Versus Machine
Call it artificial intelligence. Deep learning. Computer cognition. Whatever its name, it’s the same thing – machines recognizing clinical problems in digital images ahead of the radiologists charged with making the diagnosis.
The artificial intelligence (AI) trend is new, but it’s gaining ground quickly, according to industry experts. The advent of these technologies and radiology’s growing interest in and dependence on them has been discussed at national and international meetings, including the RSNA, HIMSS, and SIIM annual meetings, during the past year. But, there’s still a long way to go.
“We’re just barely scratching the surface of using artificial intelligence in the last few years,” said Eliot Siegel, MD, professor and vice chair of research information systems for the University of Maryland Department of Diagnostic Radiology and Nuclear Medicine. “There’s an emergence of increasing interest in the largest companies in the world, including Google, Microsoft, Apple, and IBM, in actually starting to use these technologies for data extraction and evaluation.”
AI opens the door for radiologists to compare new images with similar, existing ones, said Siegel, who also serves as the chief of imaging for the VA Maryland Healthcare System and has spoken about AI use in radiology.
The Case for AI
In effect, AI is the next generation of clinical decision support – technology designed to enhance a radiologist’s ability to identify and correctly diagnose any problems caught on diagnostic images. The trend first began with the introduction of electronic medical records (EMR) and the compilation of patient data in one central location. Its use has since expanded into clinical analytics, mining imaging data to improve medical treatment.
Any AI technology must be correctly loaded with clinical and peer-reviewed data that can be compared to any new images, Siegel said. Only then can it prompt a quicker, more accurate diagnosis. In fact, he said, in some cases, AI can cut the time invested in searching for comparative images by 80% to 90%.
“So many tasks that were previously run by humans can now be equally or better done by computers,” he said. “Look at how we apply advanced computer technologies – the implication is huge with medical imaging.”
When used correctly, he said, AI technologies house complex data from MRI, CT, ultrasound, and PET machines. Because the information is held in a single repository, it is easily searchable, and that easy access helps radiologists craft advice for referring physicians.
What Vendors Are Doing
According to industry vendors, mammography-based breast cancer screening is the easiest segment of diagnostic imaging for testing AI efficacy. On average, said Steve Tolle, chief strategy officer and president for iConnect Network Services with Merge Healthcare, radiologists miss 15% of breast cancer diagnoses. These misses stem largely from fatigue or from overlooking a malignancy because of an assumption of normalcy. Machines, he said, don’t get tired, and they view every part of an image equally.
AI can also take breast cancer diagnosis a step further, incorporating real-world, pre-existing images, said Igor Barani, MD, chief medical officer of deep learning healthcare company Enlitic. Instead of relying on BI-RADS or risk stratification, Enlitic’s technology pulls the most relevant data from past images and makes them searchable to increase efficiency in designing a patient-care plan.
“Deep learning is particularly useful in radiology because there are a lot of data variables accessible in electronic formats,” he said. “There’s clearly a substantial need to speed up radiology given the growth of medical imaging and the pressure of medical imaging being seen as a big contributor to health care costs.”
Alongside Enlitic, Merge Healthcare has partnered with IBM to introduce several AI tools. Work is underway to make them commercially available. For example, the company has developed an iPhone scanner that can diagnose mole malignancies with 90% accuracy, Tolle said.
In addition, this year, Merge plans to introduce a disease-specific audit service that offers more detailed – and searchable – information about cardiovascular disease, cancer, and chronic obstructive pulmonary disease. An EMR summarization tool is also in the works to help radiologists and cardiologists identify what information they might need from a patient’s record to better understand their diagnostic images. Through the partnership, they also plan to introduce a smart MRI that can analyze entire images and pinpoint problems that need a radiologist’s immediate attention.
These tools will make radiologists’ jobs easier, but, Tolle acknowledged, there are still challenges to widespread adoption. First, many people still have a negative impression of AI from movies, such as The Matrix or 2001: A Space Odyssey, where machines endeavor to eliminate humans. More realistically, he said, it will be more difficult to secure approval from the Food and Drug Administration and to sufficiently train radiologists and their staff to use AI technologies.
Ultimately, he said, this trend will only augment the radiologist’s place in health care.
“We’re working on a platform that really makes the radiologists what they were historically,” he said. “Back in the day, doctors had to come to radiologists to look at pictures on the wall, and they had to talk with radiologists about the cases.”
AI puts radiologists back in a strategic partnership with referring physicians because they can provide much more robust information about what they believe is going on with a patient and what treatments might be best.
“We’re not trying to replace the radiologist,” he said. “We’re trying to give them a fighting chance of keeping up with volume and dealing with the occasional surfacing of something they haven’t seen before.”
But, some in the industry worry that increased use of AI brings significant challenges. According to Jenny Chen, MD, chief executive officer and founder of 3D healthcare printing company 3DHeals, AI creates problems of its own even as it reduces the amount of time radiologists spend reading images.
As AI use increases and reduces time spent reading studies, the price tag on a radiologist’s diagnostic time drops, perpetuating the trend toward commoditization, she said in an online forum.
It’s also possible for a high level of AI accuracy to fall short of satisfying patients. Even with 99% diagnostic accuracy, the enormous volume of studies read each year means there could still be thousands of life-changing misreads. The result, she said, could be a slew of lawsuits.
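The arithmetic behind that concern is straightforward. As a back-of-the-envelope sketch (the annual study volume below is an illustrative assumption, not a figure from this article), even a 1% error rate scales into hundreds of thousands of misreads:

```python
# Back-of-the-envelope estimate: misreads at a given diagnostic accuracy.
# The annual_studies figure is assumed purely for illustration.
annual_studies = 40_000_000   # hypothetical yearly imaging volume
accuracy = 0.99               # 99% diagnostic accuracy

# Misreads are the studies falling into the remaining 1%.
misreads = round(annual_studies * (1 - accuracy))
print(f"{misreads:,} misreads per year at {accuracy:.0%} accuracy")
# prints "400,000 misreads per year at 99% accuracy"
```

Any plausible volume assumption yields a similar conclusion: at population scale, "99% accurate" still leaves a large absolute number of errors, which is the exposure Chen describes.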
“You only need one bad legal case to change the entire AI landscape, and this is true for any new health care-related technology,” said Chen, who is also a former neuroradiology adjunct clinical faculty member at Stanford Hospitals and Clinics. “So, any AI company with the intention to replace an entire profession needs to tread cautiously into this mine field.”
The challenges this trend faces are two-fold, she said. First, as an industry, radiology’s reporting style is inconsistent, making data extraction for imaging interpretations complex and confusing. Imaging acquisition also isn’t standardized. It will likely be difficult for a computer to recognize many variables, including positioning, motion artifacts, and anatomical variants.
Regardless of these potential problems, progress toward more widespread AI use has picked up within the past year, Siegel said, opening up many more possibilities for improving radiology practice, reducing diagnostic errors, and enhancing patient care. Still, he said, the process is just beginning.
“We’re really just in the first baby steps, and we need to keep learning,” Siegel said. “The impact has been relatively small so far, but I have confidence with emerging companies and start-ups that we’ll see it grow.”