Emerging research suggests that combined artificial intelligence (AI) assessment of digital mammography and automated 3D breast ultrasound provides enhanced detection of breast cancer in women with dense breasts and may be a viable alternative to radiologist interpretation in areas where radiologists are scarce.
For women with dense breasts, combining FDA-approved artificial intelligence (AI) systems for 3D breast ultrasound and digital mammography may offer improved breast cancer detection compared with either modality alone, with performance potentially equivalent to radiologist assessment.
In a retrospective study involving 430 Asian women with predominantly dense breasts, researchers compared stand-alone AI systems for automated 3D breast ultrasound (QVCAD 3.4, QView Medical) and digital mammography (Transpara® 1.7.0, ScreenPoint Medical) and a combination of the two AI systems for breast cancer detection.
The researchers found that the multimodal AI model had 81.1 percent sensitivity and 95.5 percent specificity, in comparison to 77.2 percent sensitivity and 89.1 percent specificity for AI assessment of digital mammography (AI-DM), and 86.1 percent sensitivity and 82.7 percent specificity for AI assessment of automated 3D breast ultrasound (AI-ABUS), according to the recently published study in Insights into Imaging.
In a subset analysis of 152 cases that compared multimodal AI versus two radiologist readers of digital mammography (DM) and two radiologist readers of automated breast ultrasound (ABUS), researchers found that multimodal AI had sensitivity (81.1 percent) that was higher than that of radiologist readers of DM (77.8 and 79.2 percent) but lower than that of the radiologist readers of ABUS (94.2 and 85.4 percent). However, the study authors also noted a nearly threefold higher specificity for multimodal AI (95.5 percent) in comparison to the consensus specificity for radiologist assessment of DM or ABUS (32.7 percent).
“Without the involvement of radiologists, incorporating AI results from ABUS into an existing AI system for mammograms boosts the diagnostic performance and outperforms the performance of single readers on mammography. This appears to be independent from the presence of biopsied benign lesions in the data set,” wrote lead study author Tao Tan, M.D., who is affiliated with the Department of Radiology at the Netherlands Cancer Institute, and colleagues.
(Editor’s note: For related content, see “What a New Study Reveals About Breast Density Awareness” and “Current Insights on Breast Density, Contrast-Enhanced Mammography and Supplemental Breast Cancer Screening.”)
Emphasizing the improved performance of the weighted multimodal combination of AI-DM and AI-ABUS as well as its approximate equivalence to radiologist assessment, the researchers suggested the multimodal AI model may have viable stand-alone potential in areas with limited access to radiologists.
“This may enhance the complementary value of AI in reading multi-modal screening examinations and may provide a possibility to use AI as a stand-alone reader in (radiologist-scarce) regions with populations with dense breasts that could benefit with ABUS,” pointed out Tan and colleagues.
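For readers curious how such a weighted multimodal combination works in principle, the sketch below shows a simple late fusion of per-exam suspicion scores from two AI systems. The function name, the weight value, and the example scores are all illustrative assumptions; the study's actual weighting scheme is not described here.

```python
# Hypothetical sketch of weighted late fusion of two AI suspicion scores.
# Neither the function nor the weight reflects the study's actual method.

def fuse_scores(score_dm: float, score_abus: float, weight_dm: float = 0.5) -> float:
    """Combine per-exam scores from a mammography AI and an ABUS AI.

    weight_dm is an assumed tuning parameter in [0, 1]; in practice it
    would be chosen on a validation set to balance the two modalities.
    """
    if not 0.0 <= weight_dm <= 1.0:
        raise ValueError("weight_dm must be in [0, 1]")
    return weight_dm * score_dm + (1.0 - weight_dm) * score_abus

# Example: a case flagged strongly by the ABUS AI but weakly by the DM AI.
combined = fuse_scores(0.3, 0.9, weight_dm=0.4)  # 0.4*0.3 + 0.6*0.9 = 0.66
```

A fused score like this would then be thresholded to produce a recall/no-recall decision, which is one common way a combined model can trade sensitivity against specificity relative to either system alone.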
The researchers conceded a possible “study effect” due to the use of enriched data sets in the subset analysis comparing multimodal AI to radiologist readers. Tan and colleagues also acknowledged the limitations of a single-center study and emphasized that future multicenter studies are needed to validate the study findings in a broader population of women with dense breasts.