A 3D whole brain convolutional neural network could provide enhanced sensitivity and specificity for diagnosing intracranial hemorrhages on computed tomography, according to new research presented at the Society for Imaging Informatics in Medicine (SIIM) conference in Kissimmee, Fla.
Radiology researchers from Emory University suggested that the use of a 3D whole brain convolutional neural network (CNN) may not only improve the diagnosis of intracranial hemorrhages but also facilitate the classification of subarachnoid hemorrhages (SAHs) on non-contrast computed tomography (CT).
In a new poster abstract presentation at the Society for Imaging Informatics in Medicine (SIIM) conference in Kissimmee, Fla., the study authors said their 3D whole brain CNN demonstrated an area under the curve (AUC) of 0.98, 93 percent sensitivity, and 94 percent specificity in detecting intracranial hemorrhages.
Acknowledging previous research that demonstrated the relationship between SAH and aneurysmal rupture location, the study authors were intrigued by the potential ability of a CNN to diagnose intracranial hemorrhages and differentiate between aneurysmal and non-aneurysmal SAHs.
“ … The additional spatial context afforded by a 3D neural network architecture may enable downstream classification of aneurysmal subarachnoid hemorrhages (aSAHs) versus non-aneurysmal subarachnoid hemorrhages (naSAHs),” wrote Ranliang Hu, M.D., the director of stroke imaging and the associate program director of the neuroradiology fellowship in the Division of Neuroradiology at the Emory University School of Medicine, and colleagues.
Accordingly, the researchers trained their 3D whole brain CNN with a combination of an intracranial hemorrhage data set from the Radiological Society of North America (RSNA) and institutional data sets of catheter angiography-proven aSAHs and naSAHs.
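The poster abstract does not describe the network architecture itself, so the following is only a minimal illustrative sketch of what a 3D convolutional classifier over whole-brain CT volumes might look like in PyTorch. The layer sizes, input dimensions, and class labels here are assumptions for illustration, not the authors' model.

```python
import torch
import torch.nn as nn

class WholeBrain3DCNN(nn.Module):
    """Illustrative 3D CNN for whole-brain CT classification.

    Architecture, input size, and class labels are hypothetical;
    the study abstract does not specify the actual model.
    """

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            # Input: 1-channel CT volume, e.g. (batch, 1, 64, 128, 128)
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.BatchNorm3d(16),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm3d(32),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm3d(64),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),  # global pooling over the whole brain volume
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)


if __name__ == "__main__":
    model = WholeBrain3DCNN(num_classes=2)  # e.g. hemorrhage vs. no hemorrhage
    dummy_volume = torch.randn(1, 1, 64, 128, 128)  # synthetic stand-in for a CT study
    logits = model(dummy_volume)
    print(logits.shape)  # torch.Size([1, 2])
```

Operating on the full volume rather than single slices is what gives the model the "additional spatial context" the authors cite as the rationale for downstream aSAH versus naSAH classification.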
After refining their CNN model to test for the detection of naSAH and aSAH, Hu and colleagues found an AUC of 0.95, 89 percent sensitivity, and 86 percent specificity for diagnosing naSAH. The CNN model also had an AUC of 0.83, 77 percent sensitivity, and 81 percent specificity for diagnosing aSAH, according to the poster abstract.
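For readers unfamiliar with these metrics, the short sketch below shows how AUC, sensitivity, and specificity are typically computed from a model's predicted probabilities using scikit-learn. The labels, scores, and 0.5 threshold are synthetic stand-ins for illustration only, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# Synthetic labels and model scores for illustration only; not study data.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=200)                                # 1 = hemorrhage present
y_score = np.clip(y_true * 0.7 + rng.normal(0.15, 0.2, 200), 0, 1)   # model probabilities
y_pred = (y_score >= 0.5).astype(int)                                # binary call at a 0.5 cutoff

auc = roc_auc_score(y_true, y_score)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate

print(f"AUC: {auc:.2f}, sensitivity: {sensitivity:.0%}, specificity: {specificity:.0%}")
```

AUC summarizes performance across all possible thresholds, while the reported sensitivity and specificity reflect a single operating point chosen for the classifier.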