Imaging tools have long been standard in surgical procedures. But the result is usually a picture on a screen, which doctors have to interpret and then mentally transfer to the patient. New virtual processes in surgery, however, can convert data from imaging tools into three-dimensional representations that can help with diagnostics as well as with planning and carrying out operations.
At its annual event yesterday, Hochschulmedizin Zürich, the platform for cooperation in medicine (see box), unveiled its new flagship project for 2018. Surgent (Surgeon Enhancing Technologies) aims to set new standards in precision surgery. Under the auspices of the project, new technologies to improve surgical skills, initially in spinal and brain surgery, will be developed and clinically tested.
Researchers from different disciplines at ETH Zurich, the University of Zurich and the university hospitals will collaborate on the project. It will be led by Mazda Farshad, a professor and director of Balgrist University Hospital, and Mirko Meboldt, professor of product development at ETH Zurich.
Eight research groups are involved in total, and in the coming years they intend to revolutionize the way operations are planned and carried out. The first step is to survey and document the individual anatomy and tissue of patients by recording images. That will enable the researchers to create interactive maps of the “landscapes” in which the surgeons later have to work.
Secondly, further models and simulations will be developed to enable optimal and patient-specific operation planning, as well as to make outcome prediction possible. This should make treatment results more reliable.
The third aim of Surgent is to use augmented reality (AR) to support navigation during an operation efficiently and effectively. With AR, surgeons will receive visual and acoustic information throughout the operation, extending their senses. Artificial intelligence will also make it possible to analyze surgeons’ actions during operations so as to provide them with the right information at the right time.
Mixed-reality glasses are already in use as part of clinical trials for some operations at Balgrist University Hospital. The new flagship project will build on these experiences.
Mixed-reality glasses overlay virtual information on the surgeon’s view, so that reality and virtuality interact with each other. This gives surgeons access to information that it was previously not possible to provide. For example, holographic navigation based on 3-D simulation could be offered during an operation.
If several vertebrae need to be repaired in one operation, for instance, the glasses would aid the doctor in positioning the screws correctly. A navigation platform communicates wirelessly with the glasses, and the planned position of the screws is projected into the operator’s view. The mixed-reality view, combined with 3-D position tracking, means the surgeon can locate the screw positions more quickly and position the surgical instruments more accurately during the operation. The surgeons’ hands are thus better guided through this extension of their sensory and intellectual capabilities.
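To give a rough idea of what the 3-D position tracking involves: projecting a planned screw position into the wearer’s view comes down to a standard pinhole-camera projection of a tracked 3-D point. The sketch below is purely illustrative; the function, matrices and numbers are hypothetical assumptions for a toy setup, not part of the Surgent system.

```python
import numpy as np

def project_point(point_world, R, t, K):
    """Project a 3-D point (world frame) into 2-D pixel coordinates.

    R, t: rotation and translation of the tracked camera (glasses) pose.
    K:    3x3 camera intrinsics matrix (focal lengths and image centre).
    """
    p_cam = R @ point_world + t      # world frame -> camera frame
    u, v, w = K @ p_cam              # perspective projection
    return np.array([u / w, v / w])  # divide by depth to get pixels

# Hypothetical example: a planned screw entry point 0.4 m straight
# ahead of the glasses, with an identity pose.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
pixel = project_point(np.array([0.0, 0.0, 0.4]), R, t, K)
# A point on the optical axis lands at the image centre, (640, 360).
```

In a real navigation platform, R and t would be updated continuously from the tracking system so that the overlay stays locked to the patient’s anatomy as the surgeon moves.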
This article by Adrian Ritter and Marita Fuchs appeared in UZH News.