Force feedback shows potential for tissue segmentation and interventional planning
Engineers are always on the lookout for new and better ways to present data. So it's been with the transition from 2D to 3D and, lately, to 4D, with the capture of dynamic organs or the display of large volumes of data. But even the most advanced of these techniques, whether fly-throughs of the colon or 4D fetal imaging, may not be the ultimate means for analyzing medical images.
A doctoral student at Uppsala University in Sweden has come up with a workstation that allows radiologists to "feel" as well as see radiologic images.
Erik Vidholm uses force feedback, also called haptics, to translate tissue characteristics into "feeling." This sense of touch is transmitted to the user through a haptic pen built into a workstation that displays stereo graphics. An experimental workstation, constructed by Vidholm as part of his doctoral research, allows operators to see and probe the contours of organs.
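To illustrate the general idea, the sketch below shows one common way a haptic rendering loop can turn volume data into a resisting force: the pen pushes back in proportion to the local intensity gradient, so tissue boundaries feel like surfaces. This is a minimal, hypothetical example in Python, not Vidholm's actual implementation; the function name, stiffness parameter, and gradient-based force model are assumptions for illustration.

```python
# Minimal sketch (illustrative, not the actual workstation code):
# compute a feedback force from a CT/MR volume so a haptic pen
# "feels" resistance at tissue boundaries.
import numpy as np

def haptic_force(volume, probe_pos, stiffness=0.05):
    """Return a 3-component force opposing the local intensity gradient.

    volume    : 3D NumPy array of voxel intensities (e.g., CT values)
    probe_pos : (z, y, x) voxel coordinates of the haptic pen tip,
                assumed to lie in the interior of the volume
    stiffness : scale factor mapping gradient magnitude to force
    """
    z, y, x = (int(round(c)) for c in probe_pos)
    # Central differences approximate the local intensity gradient;
    # large gradients mark transitions between tissues.
    gz = (volume[z + 1, y, x] - volume[z - 1, y, x]) / 2.0
    gy = (volume[z, y + 1, x] - volume[z, y - 1, x]) / 2.0
    gx = (volume[z, y, x + 1] - volume[z, y, x - 1]) / 2.0
    grad = np.array([gz, gy, gx], dtype=float)
    # Pushing back against the gradient makes boundaries feel like
    # surfaces the pen must be pressed through.
    return -stiffness * grad
```

In a real system a force like this would be recomputed at the device's update rate (typically around 1 kHz) and sent to the haptic hardware each cycle.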
Working at the Center for Image Analysis at Uppsala University, Vidholm has focused on the use of virtual touch as a way to speed the segmentation of reconstructed tissues by providing information about imaging data that are hard to visualize. Collaborators at the department of oncology, radiology, and clinical immunology at Uppsala University Hospital and the ITEE biomedical engineering group at the University of Queensland in Brisbane, Australia, provided the imaging data and assisted in the evaluation and validation of results from his haptic postprocessing.
Early testing has produced encouraging results, with liver CT data and breast MR data reconstructed volumetrically. In this work, the haptic rendering of medical images has allowed the user to feel boundaries between tissues and provide input to segmentation algorithms. Ultimately, the technology might find a place in planning surgery or radiation therapy, but this is a long way off. Vidholm's work has shown only the potential for haptics, a potential that remains to be quantitatively evaluated and developed.
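One plausible way haptic input can feed a segmentation algorithm is by using points the operator marks with the pen as seeds for region growing. The sketch below assumes exactly that workflow; the function name, intensity tolerance, and 6-connected flood fill are illustrative choices, not details reported from Vidholm's work.

```python
# Minimal sketch, assuming haptically placed seed points drive a simple
# region-growing segmentation of a reconstructed volume.
from collections import deque
import numpy as np

def grow_region(volume, seeds, tolerance=40.0):
    """Flood-fill voxels whose intensity stays within `tolerance`
    of the mean intensity at the user-supplied seed points."""
    mask = np.zeros(volume.shape, dtype=bool)
    reference = np.mean([volume[s] for s in seeds])
    queue = deque(seeds)
    while queue:
        z, y, x = queue.popleft()
        if mask[z, y, x]:
            continue
        if abs(volume[z, y, x] - reference) > tolerance:
            continue
        mask[z, y, x] = True
        # Visit 6-connected neighbours that lie inside the volume.
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and
                    0 <= ny < volume.shape[1] and
                    0 <= nx < volume.shape[2] and
                    not mask[nz, ny, nx]):
                queue.append((nz, ny, nx))
    return mask
```

The appeal of such a pairing is that the operator's sense of where a boundary lies, gained through the pen, becomes a direct, well-placed starting point for the algorithm rather than a guess made from a 2D slice.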