Q&A: Radiology Department Tests Artificial Intelligence

UVA Health System conducts trial on artificial intelligence integrated with PACS.

Through machine learning, smart tools, and computer-aided detection, radiology has a rich history of using digital technology to aid the radiologist in the quest for the most accurate -- and fastest -- diagnosis. And that search continues.

The latest effort comes in the form of artificial intelligence that can be integrated with your PACS, acting as a first set of eyes before you read a study more thoroughly. Through an initial analysis based on specifically designed algorithms, the software can flag studies with potential abnormalities, saving you time and potentially catching problems you might otherwise overlook.

The University of Virginia Health System is currently conducting a trial of the efficacy of this software, provided by Carestream. Diagnostic Imaging recently spoke with the trial's leader, Cree Gaskin, MD, professor, chief of musculoskeletal imaging and intervention, and associate chief medical informatics officer, about the health system's experience to date and how it could affect your efficiency and workflow.

Diagnostic Imaging: Why was this technology deemed necessary or useful for radiologists?

Gaskin: We're always looking for ways to do things better. Our workflow happens to be electronic. So, there are a lot of opportunities to apply evolving technology to enhance what we do. One of the greatest challenges in radiology is to interpret a large number of images and not miss relevant imaging findings. If we can use technology to help us be more consistent and get more out of images, that makes sense.

Diagnostic Imaging: How is this artificial intelligence (AI) integrated with the PACS?

Gaskin: A study is performed on the CT scanner, the images go to the PACS, and a copy is sent to a separate server where the AI engine sits. It interprets the images and stores results. When the radiologist opens that study for interpretation, the PACS communicates with the third-party server and the AI results are delivered. An icon overlays the PACS viewer or desktop and lets you know the AI results are available. If you're reading three studies in a row, the first one might not have AI results, but the next case might display the icon. The icons are color-coded so you can understand them at a glance. A green icon means everything is fine -- there are no abnormalities. You spend only a fraction of a second recognizing the green color and going on with your workflow. Red means the AI algorithm has produced abnormal results. Click on the icon, and you can see the list of findings it has identified. Then you can decide whether to include those findings in your report.

The algorithms collectively need time to interpret images. If you open a study immediately after it reaches PACS, the results won't be back. It takes about 5 or 6 minutes for the results to be recorded. 
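As a rough illustration of the integration Gaskin describes, the sketch below shows how a PACS client might query a third-party results server and map the response to the green/red icon he mentions. The endpoint, the payload shape, and the "404 means not ready yet" convention are assumptions for illustration only, not Carestream's actual interface.

```python
from typing import Optional

import requests  # standard HTTP client; the server it talks to here is hypothetical

AI_RESULTS_URL = "https://ai-results.example.org"  # assumed endpoint, for illustration


def fetch_ai_results(study_uid: str) -> Optional[dict]:
    """Ask the AI results server about a study; None means the algorithms haven't finished."""
    resp = requests.get(f"{AI_RESULTS_URL}/results/{study_uid}", timeout=5)
    if resp.status_code == 404:
        # Results typically take about 5-6 minutes after the study reaches PACS.
        return None
    resp.raise_for_status()
    # Assumed payload shape: {"findings": [{"label": "...", "abnormal": true}, ...]}
    return resp.json()


def icon_color(results: Optional[dict]) -> Optional[str]:
    """Map AI results to the color-coded icon: green = no abnormalities, red = abnormal findings."""
    if results is None:
        return None  # no icon yet; the radiologist simply reads without AI support
    abnormal = [f for f in results["findings"] if f["abnormal"]]
    return "red" if abnormal else "green"
```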

If the algorithms identify findings before a radiologist opens the case for reading, there is an opportunity for the AI platform to inform the PACS or the radiologist of an urgent finding, thus influencing study prioritization, or which exam is read first. For example, we're hoping that in the near future an algorithm will be able to let you know if a head CT might have an urgent finding. If it identifies blood on the brain, it could add a warning flag to your worklist, and you would read that study next. We don't have this functionality yet, but it's anticipated in the relatively near future.
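The prioritization he anticipates could be modeled roughly as follows: a study whose AI findings include an urgent label moves to the top of the reading worklist. The WorklistItem structure and the finding label are hypothetical; as he notes, no such algorithm is in clinical use at UVA yet.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Hypothetical label a future head-CT algorithm might emit for blood on the brain.
URGENT_LABELS = {"intracranial_hemorrhage"}


@dataclass
class WorklistItem:
    study_uid: str
    arrived: datetime
    ai_findings: List[str] = field(default_factory=list)

    @property
    def urgent(self) -> bool:
        return any(label in URGENT_LABELS for label in self.ai_findings)


def prioritized(worklist: List[WorklistItem]) -> List[WorklistItem]:
    """Urgent AI-flagged studies first; within each group, oldest studies first."""
    return sorted(worklist, key=lambda item: (not item.urgent, item.arrived))
```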

Right now, you're very limited in the kinds of studies you can read with algorithm-based AI support. The algorithms are few and very specific, each targeting a particular finding. If you're reading a chest CT, you may look at numerous different imaging features, but the algorithms can collectively only look at a few of them. Our limited set of algorithms currently produces assessments for emphysema in the lungs, fatty change in the liver, coronary artery disease or calcification, and bone density measurements. Currently, our algorithms only work on CTs of the chest, abdomen, or pelvis. Another one is in the works for wedge compression fractures in the spine.
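To make that scope concrete, here is a toy routing table in the spirit of what he describes: a handful of algorithms, each targeting one finding and applicable only to certain CT regions. The mapping is assembled loosely from the examples in the interview and is illustrative only, not a vendor specification.

```python
# Illustrative only: which assumed algorithms might run on which CT body regions.
ALGORITHMS = {
    "emphysema": {"chest"},
    "hepatic_steatosis": {"chest", "abdomen"},
    "coronary_calcification": {"chest"},
    "bone_mineral_density": {"chest", "abdomen", "pelvis"},
}


def applicable_algorithms(modality: str, body_region: str) -> list[str]:
    """Return the algorithms that would assess a study; empty for unsupported studies."""
    if modality != "CT":
        return []  # the current algorithms only work on CT
    return [name for name, regions in ALGORITHMS.items() if body_region in regions]


# A chest CT would get assessments for emphysema, fatty liver, coronary calcification, and bone density.
print(applicable_algorithms("CT", "chest"))
```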

Diagnostic Imaging: How does it identify findings that radiologists could miss?

Gaskin: We're in the early days of the technology, and radiologists are already very good at identifying most of the things I've mentioned, except for bone mineral density. The algorithms might make us less likely to miss something, though we will typically make the findings on our own. If you're reading a chest CT for lung cancer, your eyes are focused on looking for evidence of malignancy, and you might be less likely to notice coronary artery calcification that is suggestive of coronary artery disease. We can get distracted when reading numerous images with multiple findings. You're less likely to miss something if it shows up on a list of positive findings.

Abnormal bone mineral density is one example of a diagnosis a radiologist may miss when reading a chest CT. Historically, we've not been focused on osteoporosis when reading a chest CT, and it can be difficult to determine subjectively. But the algorithm does a computational assessment and estimates bone density. That's information we would not normally get while looking at a study. This can add value. As the number of algorithms grows, we expect additional value out of this approach. Mammography is a potential example. We know mammography is excellent at breast cancer detection, but it's imperfect. Individuals can miss things, and a future algorithm may make it more likely that a breast cancer will be successfully identified.

Diagnostic Imaging: What are the challenges or stumbling blocks to using AI?

Gaskin: It's important to be sure the AI is integrated smoothly into the workflow. At UVA, with our industry partner, we're largely past this point. The technology is unobtrusive and doesn't slow us down. Anyone new will face this hurdle: the need to work with existing clinical vendors and integrate into established clinical workflows.

Our current algorithms only state that an abnormality is present, which is fine for the limited types of abnormalities they are identifying. As we move to algorithms that identify specific focal lesions, it will be important that our user interface can successfully highlight lesions in an efficient manner.

The greatest stumbling block is developing numerous highly accurate algorithms that each produce clinical impact; otherwise, they may not be worthwhile. Radiologists are very good at what they do, so it's a challenge to develop algorithms that add to their skill and effort.

Diagnostic Imaging: To date, what have the results been in the field?

Gaskin: The impact to date is that some of our clinicians have been influenced by radiology reports that include AI results. I think the clearest example is bone density. Our radiologists often read CTs for trauma or oncologic indications, and the AI algorithm alerts them that the patient may have osteoporosis when they might not otherwise have made the diagnosis. The radiologist's report may include a recommendation for DEXA to more formally assess for osteoporosis. This may lead to earlier diagnosis and treatment of osteoporosis than would have taken place without the algorithm.

Diagnostic Imaging: What are the next steps for AI technology?

Gaskin: We do have algorithms that pop up results when we're interpreting images, but it would be great if they could also help us prioritize which study to read next. I mentioned the example earlier: if the algorithm finds a bleed in the brain, the PACS could be notified of an urgent finding, or perhaps the radiologist could be sent an automated text message prompting them to read that study next. Having an algorithm that identifies and warns about a positive brain bleed would help prioritize completed studies for interpretation.

Another next step for us would be a pop-up that points to specific findings that help us see lesions, such as a breast cancer or lung nodule, rather than simply notifying us of more general findings that don’t require such a specific pointer.

It's also critical that this developing technology be accurate enough that it's not distracting. Some radiologists report concerns with traditional CAD in mammography, with the computer pointing out too many false positives. The distraction can be inefficient, if not counterproductive. A very good mammographer may read a huge number of screenings before finding a breast cancer. If he or she has to stop and review a potential positive finding on every mammogram -- and they all turn out not to be breast cancer -- then it just slows down and distracts the radiologist. Radiologists seek something more accurate than what is available today. The promise, and hopefully not the hype, of AI is that we will eventually get image interpretation algorithms that significantly outperform historical CAD.

Diagnostic Imaging: What feedback have you received?

Gaskin: When radiologists see this technology integrated into their workflow, they're surprised at how smooth and unobtrusive it can be for AI-generated results to be available to them during their interpretation. It doesn't add an additional burden. It's tangible, and thus helps them see how this type of technology can assist them now or in the future. Even though the algorithms are limited, it's a taste of what's to come. It has some clinical impact now, and I think people can see that this can work for them. This helps them generate ideas for additional algorithms they'd like to see developed and implemented.
