3D ultrasound is a key component of robotic tools that could one day perform surgery on battlefields, in space, and at remote locations with minimal human guidance.
Engineers at Duke University call the results of feasibility studies conducted in their laboratory the first concrete steps toward this space-age vision of the future. More immediately, they said, the technology they have developed could make certain contemporary medical procedures safer for patients.
For their proof-of-concept experiments, the engineers started with a rudimentary tabletop robot guided by novel 3D ultrasound technology developed in the Duke laboratories. An artificial intelligence program served as the robot's brain by taking real-time 3D information, processing it, and giving the robot specific commands to perform.
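The loop described above, real-time 3D images in, robot commands out, can be sketched in a few lines. Everything here is illustrative: the function names, the toy "frame" of voxels, and the proportional step are assumptions for the sketch, not details of the Duke system.

```python
# Hypothetical sense-process-act loop: a 3D ultrasound frame yields a target
# position, and a simple planner turns it into a move command for the robot.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

def locate_target(frame):
    """Stand-in for image processing: pick the brightest voxel's coordinates."""
    best = max(frame, key=lambda v: v[3])  # each voxel is (x, y, z, intensity)
    return Pose(best[0], best[1], best[2])

def plan_step(tool: Pose, target: Pose, gain: float = 0.5) -> Pose:
    """Proportional step toward the target -- the 'command' sent to the robot."""
    return Pose(gain * (target.x - tool.x),
                gain * (target.y - tool.y),
                gain * (target.z - tool.z))

# One loop iteration: frame -> target -> command
frame = [(0.0, 0.0, 0.0, 0.1), (1.0, 2.0, 3.0, 0.9)]
tool = Pose(0.0, 0.0, 0.0)
cmd = plan_step(tool, locate_target(frame))
print(cmd)  # Pose(x=0.5, y=1.0, z=1.5)
```

In a real system the image-processing step would be far more involved, but the structure, perceive, plan, command, repeat, is the same.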
"In a number of tasks, the computer was able to direct the robot's actions," said Stephen Smith, director of the Duke ultrasound transducer group. "Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots — without the guidance of the doctor — can someday operate on people."
The results of a series of experiments in which the robot system directed catheters inside synthetic blood vessels were published online in IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control (2008;55:1143-1145). A second study, published in April in Ultrasonic Imaging, demonstrated that the autonomous robot system could successfully perform a simulated needle biopsy.
Advances in ultrasound technology have made these latest experiments possible by generating detailed 3D moving images in real time, the researchers said. In the latest experiment, the robot successfully performed its main task: directing a needle on the end of the robotic arm to touch the tip of another needle within a blood vessel graft. The robot's needle was guided by a tiny 3D ultrasound transducer, the "wand" that collects the 3D images, attached to a catheter commonly used in angioplasty procedures.
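The needle-to-needle task is, in essence, a closed-loop convergence problem: at each cycle the imaged gap between the two tips shrinks as the robot takes a small corrective step. The sketch below is a toy model of that idea under assumed parameters (the gain, tolerance, and iteration cap are hypothetical, not values from the published experiments).

```python
# Illustrative closed-loop guidance: repeated proportional steps drive the
# robot's needle tip toward the target tip until within a small tolerance.
import math

def guide(tip, target, gain=0.5, tol=0.01, max_iters=100):
    """Step tip toward target until the remaining gap is within tol."""
    for i in range(max_iters):
        if math.dist(tip, target) <= tol:
            return tip, i
        tip = tuple(t + gain * (g - t) for t, g in zip(tip, target))
    return tip, max_iters

tip, iters = guide((0.0, 0.0, 0.0), (1.0, 2.0, 3.0))
print(iters, math.dist(tip, (1.0, 2.0, 3.0)) <= 0.01)  # 9 True
```

With a gain of 0.5, each cycle halves the remaining distance, so convergence is geometric; a real controller would also have to contend with imaging noise and tissue motion.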
"The robot was able to accurately direct needle probes to target needles based on the information sent by the catheter transducer," said John Whitman, a senior engineering student in Smith's laboratory and first author on both papers. "The ability of the robot to guide a probe within a vascular graft is a first step toward further testing the system in animal models."
The other Duke members of the team were Matthew Fronheiser and Nikolas Ivancevich. Research in Smith's lab is supported by the National Institutes of Health.
While researchers will continue to refine the ability of robots to perform independent procedures, the new technology could also have more direct and immediate applications.
"Currently, cardiologists doing catheter-based procedures use fluoroscopy, which employs radiation, to guide their actions," Smith said. "Putting a 3D ultrasound transducer on the end of the catheter could provide clearer images to the physician and greatly reduce the need for patients to be exposed to radiation."
In the earlier experiments, the tabletop robot arm successfully touched a needle mounted on the arm to another needle in a water bath. It then performed a simulated biopsy of a cyst fashioned out of a liquid-filled balloon in a medium designed to mimic tissue.
"These experiments demonstrated the feasibility of autonomous robots accomplishing simulated tasks under the guidance of 3D ultrasound, and we believe that it warrants additional study," Whitman said.
Adding this 3D capability to more powerful and sophisticated surgical robots already in use at many hospitals could hasten the development of autonomous robots that could perform complex procedures on humans, the researchers said.