Researchers at Rensselaer Polytechnic Institute, Harvard Medical School, Albany Medical Center and the Massachusetts Institute of Technology are collaborating to develop a new approach to surgical training—a virtual simulator that will allow surgeons to handle computer-generated organs with actual tools used in minimally invasive surgery.
Based on haptics, a science focused on the sense of touch, the new simulator is intended to provide an immersive environment for surgeons to touch, feel, and manipulate computer-generated 3-D tissues and organs with instruments handled in actual surgery. Not only might this be useful for surgical training, but it could provide a standardized assessment of surgical skill.
“The most important single factor that determines the success of a surgical procedure is the skill of the surgeon,” notes Suvranu De, assistant professor of mechanical, aerospace, and nuclear engineering and director of the Advanced Computational Research Lab at Rensselaer.
The researchers published a description of their new computational technique in the June/July issue of the journal Presence. Beginning in the summer of 2006, their work will be supported by a $1.4 million, four-year grant from the National Institutes of Health (NIH). This funding extends the original three-year exploratory NIH grant De received in 2004 to support the initial phases of the research.
Surgical simulators, even more than flight simulators, demand intense computation. To produce realistic touch feedback from a surgical probe navigating through soft tissue, the researchers must develop efficient computer models that run roughly 30 times faster than real-time graphics, solving complex sets of partial differential equations about a thousand times a second.
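The arithmetic behind that "30 times faster" figure can be sketched directly from the two rates the article cites: graphics looks smooth at about 30 frames per second, while stable force feedback requires updates at about 1,000 Hz. The loop-budget calculation below is an illustrative sketch, not the team's code.

```python
# Timing-budget sketch (illustrative; rates taken from the article).
GRAPHICS_RATE_HZ = 30    # typical real-time graphics refresh rate
HAPTIC_RATE_HZ = 1000    # force-feedback update rate for stable touch

# Time available for one solve of the tissue model, in milliseconds.
graphics_budget_ms = 1000.0 / GRAPHICS_RATE_HZ   # ~33.3 ms per frame
haptic_budget_ms = 1000.0 / HAPTIC_RATE_HZ       # 1.0 ms per force update

# Each PDE solve must finish roughly 30x faster than a graphics frame.
speedup_needed = graphics_budget_ms / haptic_budget_ms

print(f"Graphics budget: {graphics_budget_ms:.1f} ms per frame")
print(f"Haptic budget:   {haptic_budget_ms:.1f} ms per update")
print(f"Required speedup over graphics-rate simulation: ~{speedup_needed:.0f}x")
```

In other words, a model that is merely fast enough for on-screen animation still has only about one thirtieth of the time it needs per haptic update, which is why the efficiency of the underlying solver matters so much.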
One major challenge for current technologies is the simulation of soft biological tissues, according to De. Such tissues are heterogeneous and viscoelastic, meaning they exhibit characteristics of both solids and liquids, much like chewing gum or Silly Putty. And surgical procedures such as cutting and cauterizing are almost impossible to simulate with traditional techniques.
To overcome these barriers, De's group has developed a new computational tool that models human tissue as a collection of particles with distinct, overlapping zones of influence that produce coordinated, elastic movements. Each particle is modeled as a single point in space, and its relationship to nearby points is governed by the equations of physics. The localized points migrate along with the tip of the virtual instrument, much like a roving swarm of bees. This method enables the program to rapidly perform hundreds of thousands of calculations for real-time touch feedback, making it superior to other approaches, according to the researchers.
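The particle idea above can be illustrated with a toy force computation: each particle carries a smooth weight function that fades to zero at the edge of its zone of influence, so only particles near the virtual instrument's tip contribute to the reaction force. Everything below is a hypothetical sketch under simple assumptions (the kernel shape, the spring-like force law, and the sample data are all illustrative, not the team's model).

```python
import math

def kernel(distance, radius):
    """Smooth weight that fades to zero at the edge of a particle's
    zone of influence (a simple quartic kernel; choice is illustrative)."""
    if distance >= radius:
        return 0.0
    q = distance / radius
    return (1.0 - q * q) ** 2

def probe_force(probe, particles, radius=1.0, stiffness=5.0):
    """Sum spring-like reaction forces on the probe tip from every
    particle whose zone of influence contains it."""
    fx, fy, fz = 0.0, 0.0, 0.0
    for px, py, pz in particles:
        dx, dy, dz = probe[0] - px, probe[1] - py, probe[2] - pz
        d = math.sqrt(dx * dx + dy * dy + dz * dz)
        w = kernel(d, radius)
        if w > 0.0 and d > 1e-12:
            # Push the probe away from the particle, weighted by the kernel,
            # so overlapping zones blend into one smooth force.
            fx += stiffness * w * dx / d
            fy += stiffness * w * dy / d
            fz += stiffness * w * dz / d
    return (fx, fy, fz)

# A small cluster of tissue particles near the origin (illustrative data).
tissue = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (0.0, 0.5, 0.0)]
force = probe_force((0.2, 0.1, 0.0), tissue)
```

Because the kernel vanishes outside each zone of influence, a probe far from the tissue feels no force at all, and in a full simulator only the particles tracking the instrument tip, the "roving swarm", ever need to be evaluated in the inner haptic loop.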
“Our approach is physics-based,” De said. “The technologies that are currently available for surgical simulation are mostly graphical renderings of organs, and surgeons are not very happy with them.”
The team plans to develop initial prototype technology to be tested by surgeons and surgical residents at the Carl J. Shapiro Simulation and Skills Center at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Upon the development of a successful prototype, researchers hope to apply the model to a much wider class of medical procedures.
“The grand vision,” De said, “is to develop a palpable human—a giant database of human anatomy that provides real-time interactivity for a variety of uses, from teaching anatomy to evaluating injuries in a variety of scenarios. In the long run, a better simulator could even help in the design of new surgical tools and techniques.”