Laparoscopic surgery is routine for many procedures, including gallbladder and appendix removal. In these minimally invasive operations, surgical instruments are inserted into a patient’s body, along with a fiber-optic camera and lighting system, through a five-millimeter incision. Laparoscopic surgery can reduce complications and speed patient recovery, but it presents unique challenges for surgeons.
Unlike open surgery, laparoscopy limits a surgeon’s visual field, which compromises hand-eye coordination. It also precludes the haptic, or tactile, feedback that tells a surgeon when an instrument is headed in the wrong direction.
The National Science Foundation has awarded $1.9 million over four years to principal investigator Jerzy Rozenblit, University of Arizona Distinguished Professor in the department of electrical and computer engineering with a joint appointment in the department of surgery, to develop and test his Computer-Aided Surgical Trainer.
The CAST system will be the first to provide haptic guidance and augmented reality images to trainees as they directly manipulate surgical instruments.
“Nobody is developing haptic guidance technology like we are,” said Rozenblit, a former head of his department and the Raymond J. Oglethorpe Endowed Chair. “We anticipate the CAST system will speed up learning, reinforce good habits and techniques and discourage bad ones, and, ultimately, lead to better surgical outcomes and improved patient safety.”
Rozenblit -- with considerable help from Tech Launch Arizona, the UA unit that helps commercialize technology -- learned in 2016 that his first patent for CAST would be issued. For the NSF study, his team will develop and test a fourth version of the prototype. Rozenblit continues to work with TLA to enhance the commercial potential of the technology.
Surgical Space Navigation
The system is designed for “surgical space navigation,” said Rozenblit, and features “no-fly zones.” When a trainee moves the instrument toward a no-fly zone, vibrational or other haptic signals indicate they must take a different course. Sophisticated graphics on a video monitor illustrate the trajectory of surgical instruments to help users avoid danger zones.
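The idea of a no-fly zone with graduated haptic warnings can be sketched in a few lines of code. The sketch below is an illustrative assumption, not CAST's actual implementation: it models a zone as a simple sphere around a fragile structure and escalates the warning as the instrument tip approaches.

```python
import math

# Hypothetical sketch (not the CAST algorithm): a no-fly zone is modeled
# as a sphere around a fragile structure; the warning level escalates as
# the instrument tip approaches. Units are assumed to be millimeters.

def _distance(tip, center):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((t - c) ** 2 for t, c in zip(tip, center)))

def haptic_warning(tip, zone_center, zone_radius, margin=5.0):
    """Return the warning level for the instrument tip.

    'inside'      -- tip has entered the no-fly zone (strongest signal)
    'approaching' -- tip is within `margin` mm of the zone boundary
    'clear'       -- tip is safely away from the zone
    """
    d = _distance(tip, zone_center)
    if d <= zone_radius:
        return "inside"
    if d <= zone_radius + margin:
        return "approaching"   # vibration cues the trainee to change course
    return "clear"

# Example: a 10 mm zone at the origin; a tip 12 mm away triggers a warning.
print(haptic_warning((0.0, 0.0, 12.0), (0.0, 0.0, 0.0), 10.0))  # approaching
```

A real trainer would run a check like this at the instrument-tracking frame rate and map the warning levels onto vibration intensity.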
Rozenblit and his team of doctoral students developed the mathematical models that recognize, guide and evaluate CAST user performance. With remarkable precision, his algorithms help trainees navigate around obstacles and avoid surgical collisions -- a serious hazard in laparoscopic procedures, where instruments can strike each other or anatomical structures. His software also presents CAST trainees with unexpected events they might see during real operations and analyzes their responses.
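One widely used performance metric in the laparoscopic skills-assessment literature is economy of motion: the ratio of the straight-line distance between the start and end of a movement to the total path length actually traced by the instrument tip. The source does not specify which metrics CAST computes, so the sketch below is an assumption drawn from that literature, not Rozenblit's model.

```python
import math

# Hypothetical sketch: economy of motion, a standard laparoscopic skills
# metric (assumed here; not necessarily the metric CAST uses).

def _dist(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def economy_of_motion(path):
    """Ratio of direct distance to actual path length, in (0, 1].

    `path` is a list of (x, y, z) instrument-tip samples. A score of 1.0
    means a perfectly direct movement; lower scores mean wasted motion.
    """
    if len(path) < 2:
        return 1.0
    total = sum(_dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    direct = _dist(path[0], path[-1])
    return direct / total if total > 0 else 1.0

# A direct move scores 1.0; a detour through a waypoint scores lower.
direct = [(0, 0, 0), (10, 0, 0)]
detour = [(0, 0, 0), (5, 5, 0), (10, 0, 0)]
print(economy_of_motion(direct))  # 1.0
print(economy_of_motion(detour))  # ~0.707
```

Metrics of this kind give an objective, per-trainee score that can be compared across sessions, which is the sort of data-based assessment the NSF pilot study is designed to produce.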
In this collaborative NSF study, Henry Fuchs, the Federico Gil Distinguished Professor of Computer Science and adjunct professor of biomedical engineering at the University of North Carolina at Chapel Hill and a renowned pioneer in virtual reality, will build an augmented-reality image-processing technology for CAST that can display holographic and enlarged images of organs and vessels in real time.
Verifying the Tech
Beyond developing more realistic and effective surgical training, a major goal of the NSF project is to prove CAST’s superiority over traditional surgical training.
Under the leadership of co-principal investigator Allan Hamilton, MD, a Harvard Medical School-trained neurosurgeon and professor of surgery in the UA College of Medicine, up to 100 UA medical students, residents and surgeons will be trained on the device starting in 2019. Their performance will be compared with that of trainees taught laparoscopy the old-fashioned way, by an experienced surgeon.
This pilot clinical training study will provide the first objective, data-based assessment of how well computer-aided surgical training teaches basic tasks, like grasping, cutting and suturing.
Rozenblit and Hamilton theorize their computerized system will win the contest -- hands down.
“Computers can detect and correct a surgeon’s hand movements far more precisely than we can,” Hamilton said. “With computer-assisted surgery, we won’t just say a surgeon has good hands; we’ll be able to say exactly how good those hands are, or whether they show signs of tremor or slowed response time that neither the surgeon nor their colleagues can recognize.”
Close to Home
Rozenblit will spend part of spring 2017 conducting research as a Fulbright Scholar in Poland, where he was raised by his parents, both physicians. He will help researchers at Wroclaw University of Technology build their own version of CAST and shadow surgeons in the operating rooms at Wroclaw Medical University to better understand their clinical challenges.
Researchers at Johns Hopkins Medicine reported in early 2016 that medical error, which causes more than 250,000 deaths every year, is the nation’s third leading cause of death, after heart disease and cancer.
“Our motivation for wanting to improve surgical training is obvious,” Rozenblit said. “If CAST can improve the outcome for even one patient, it will be a resounding success.”
Rozenblit predicts his system will be used in U.S. medical training centers within two years, and adapted for use in operating rooms within 20 years. He hopes to see the CAST software become available in an open-source platform, so students and surgeons around the world can access it from their own computers.
Collaborative Research: Computer Guided Laparoscopy Training is funded by the National Science Foundation Smart and Connected Health program under grant No. 1622589. The co-principal investigator is Allan Hamilton, MD, professor of surgery in the UA College of Medicine with joint appointments in the departments of radiation oncology, psychology, and electrical and computer engineering. The UA is collaborating with Henry Fuchs, the Federico Gil Distinguished Professor of Computer Science at the University of North Carolina at Chapel Hill, who will develop an augmented-reality imaging system for the project.