The widespread, deeply integrated use of artificial intelligence (AI) tools throughout all of radiology is still years—perhaps decades—away. But the initial steps toward using this technology are already in place.
According to neuroradiology leaders at this year’s American Society of Neuroradiology annual meeting in Vancouver, both patients and providers are already experiencing positive impacts even if the technology still has a way to go.
“Convolutional neural networks (CNNs) are not quite ready for prime time at this point,” says Michael Lev, MD, director of emergency radiology and emergency neuroradiology at Massachusetts General Hospital. “It will not replace, but supplement radiology jobs. It will be a tool that radiologists will use for niche applications that can be done for certain detections.”
For the most part, he says, AI and machine learning will be used both as second-read and screening tools.
Potential real-world uses
Based on the recently published DAWN and DEFUSE 3 studies in the New England Journal of Medicine, a thrombectomy performed up to 16 hours (DEFUSE 3) or 24 hours (DAWN) after stroke onset can reduce a patient’s disability. CT and CTA scans can identify stroke patients with large vessel occlusions, making them good thrombectomy candidates. However, not all hospitals provide advanced stroke services.
With a properly trained AI platform, Lev says, facilities could quickly identify scans with these occlusions, moving them to the top of the must-read pile. Providers can then make an informed decision about whether to transfer patients to a tertiary care center that can provide more advanced stroke care. In effect, he says, using AI this way will make any community hospital a more integral partner in stroke care.
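The triage workflow Lev describes—an AI model flags suspected large vessel occlusions, and those studies jump to the front of the reading queue—can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the `Study` class, the `lvo_flagged` attribute, and the `prioritize` function are all hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Study:
    """A single imaging study on a radiologist's worklist (hypothetical model)."""
    accession: str
    modality: str
    lvo_flagged: bool = False  # set True by a hypothetical LVO-detection model

def prioritize(worklist):
    """Move AI-flagged suspected-LVO studies to the top of the pile,
    preserving arrival order within each group (a stable partition)."""
    flagged = [s for s in worklist if s.lvo_flagged]
    routine = [s for s in worklist if not s.lvo_flagged]
    return flagged + routine

# Three studies arrive in order; only A002 is flagged by the model.
worklist = [
    Study("A001", "CT"),
    Study("A002", "CTA", lvo_flagged=True),
    Study("A003", "CT"),
]
ordered = prioritize(worklist)
print([s.accession for s in ordered])  # ['A002', 'A001', 'A003']
```

The key design point matches Lev's description: the AI does not interpret the scan or replace the read; it only reorders the queue so a human reads the likely-critical study first.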
“We think we have the tools that can weed out a hemorrhage,” he says. “If it could detect a large vessel occlusion on CTA, we could transfer patients to a hospital that is better qualified to measure the size of an infarct and the next steps to take.”
Using AI in this way is an important step toward augmenting providers’ ability to deliver better patient care, Lev says.
“We are very close to having deep learning CNN systems that will be available to assist us in very well-defined, focused tasks that will help with prioritizing scans,” he says. “I think we’re still decades away from it being able to do what a radiologist does—to look at a whole scan and put it in clinical context and integrate it, but we’re very close to having even greater use of these tools.”
What the ACR is doing
According to Sumit Niogi, MD, assistant professor of radiology with Weill Cornell and New York-Presbyterian, the American College of Radiology (ACR) is still refining ACR Assist to give radiologists an effective structured reporting tool. Efforts are currently underway, he says, to add machine learning capabilities for image feature recognition and AI for natural language processing.
“The ACR Assist can leverage these technologies and create both a structured report helpful for referring physicians to interpret and also to serve as an educational tool for radiologists,” he says.
Not only can the structured reports notify readers when critical information is missing, they can also auto-populate fields and minimize errors. Additionally, with the new Traumatic Brain Injury Reporting & Data System (TBI-RADS), these tools can provide evidence-based management recommendations.
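The two behaviors described here—flagging missing critical fields and auto-filling the rest—amount to simple report validation. The sketch below illustrates the idea under loose assumptions; the field names are illustrative and are not the actual ACR Assist or TBI-RADS schema.

```python
# Hypothetical structured-report check: flag missing critical fields,
# then fill non-critical fields with safe defaults.

REQUIRED_FIELDS = ["mechanism_of_injury", "gcs_score", "hemorrhage_present"]
DEFAULTS = {"comparison_exam": "None available"}  # illustrative default only

def missing_critical(report: dict) -> list:
    """Return the critical fields absent from the report (to notify the reader)."""
    return [f for f in REQUIRED_FIELDS if f not in report]

def auto_populate(report: dict) -> dict:
    """Fill non-critical fields with defaults; never overwrite entered values."""
    filled = dict(DEFAULTS)
    filled.update(report)
    return filled

report = {"mechanism_of_injury": "fall", "gcs_score": 14}
print(missing_critical(report))                     # ['hemorrhage_present']
print(auto_populate(report)["comparison_exam"])     # 'None available'
```

The notify/auto-populate split mirrors the article's point: critical gaps go back to the radiologist, while routine boilerplate is filled automatically to minimize errors.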
The goal, Niogi says, is for AI and machine learning to help radiologists meet the daily challenges of maintaining their current workload, navigating frequent technology changes, and achieving new levels of quality service for patients.
Practicing providers can help improve these technologies, he says. Contacting the ACR with your needs and wants will help the organization integrate your input, shaping the continued evolution of ACR Assist.
Eventually, he hopes ACR Assist will be more than dictation software that provides structured reports.
“I have hopes that it can be much more as an educational tool for trainees,” he says. “I see it as a virtual assistant by integrating with RADS and imaging guidelines to provide evidence-based recommendations, and, as the name implies, assist the radiologist in any number of other ways.”
Even as these technologies grow and expand their capabilities, neuroradiologists should see them only as supplemental tools that support their overall patient care efforts, Lev says.
“No one should worry about their jobs right now. These things are going to be tools, and they’re going to make us more efficient,” he says. “They’re going to let us do things that we don’t do as well right now. They will be complementary tools that will add to care.”