Multimodal user interface for a semi-robotic visual assistance system for image guided neurosurgery
Authors: T. Lutze, Klaus Radermacher, Axel Perneczky, S. Serefoglou, Wolfgang Lauer

Subject: Engineering, Microphone, ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION, Optical head-mounted display, Stereoscopy, General Medicine, Virtual reality, Mechatronics, Imaging phantom, Computer vision, Artificial intelligence, User interface, Simulation, Digital camera
Abstract: We developed a visual assistance system for image guided neurosurgery, consisting of a stereoscopic digital camera (exoscope) mounted on a semi-robotic manipulator. In order to minimize operation time, the application-specific multimodal user interface enables hands-free manipulation of the exoscope. For this purpose, the surgeon wears a head-mounted unit with a binocular display, a head tracker, a microphone and earphones. Different modes of view positioning and adjustment can be selected by voice and controlled by head rotation while pressing a miniature confirmation button mounted on a finger ring or suction device. Apart from the development of the mechatronic and software modules of the semi-robotic manipulator, initial studies focused on the evaluation and optimization of the intuitiveness, comfort and precision of different modes of operation. For the user-based evaluation of different control modes, a simulation was implemented in stereoscopic virtual reality. Results of user tests with neurosurgeons are presented in this paper. Based on these findings, the multimodal user interface has been implemented in the first lab-type prototype. The system has been demonstrated in the operating room in initial tests on a phantom together with the clinical partner.
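The abstract describes a clutch-style interaction: a voice command selects the positioning mode, and head rotation steers the exoscope only while the confirmation button is held. The sketch below illustrates that control loop in Python; it is not the authors' implementation, and all class names, modes, and gain values are illustrative assumptions.

```python
"""Minimal sketch of a voice-selected, head-rotation-driven clutch control loop,
as described in the abstract. Names and parameters are hypothetical."""

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple


class Mode(Enum):
    PAN = auto()     # translate the view in the image plane
    ORBIT = auto()   # rotate the exoscope around the focal point
    ZOOM = auto()    # move along the viewing axis


@dataclass
class HeadPose:
    yaw: float    # degrees, from the head tracker
    pitch: float  # degrees, from the head tracker


class ExoscopeController:
    """Maps head-rotation deltas to manipulator commands while the button is held."""

    def __init__(self, gain: float = 0.5):
        self.mode = Mode.PAN
        self.gain = gain                      # scales head motion to exoscope motion
        self._reference: Optional[HeadPose] = None

    def on_voice_command(self, command: str) -> None:
        # A recognized voice command selects the control mode; others are ignored.
        try:
            self.mode = Mode[command.upper()]
        except KeyError:
            pass

    def update(self, pose: HeadPose, button_pressed: bool) -> Tuple[float, float]:
        """Return a (dx, dy) command for the current mode; zero when unclutched."""
        if not button_pressed:
            self._reference = None            # releasing the button stops motion
            return (0.0, 0.0)
        if self._reference is None:
            self._reference = pose            # clutch engaged: capture reference pose
            return (0.0, 0.0)
        dx = self.gain * (pose.yaw - self._reference.yaw)
        dy = self.gain * (pose.pitch - self._reference.pitch)
        return (dx, dy)


if __name__ == "__main__":
    ctrl = ExoscopeController()
    ctrl.on_voice_command("orbit")                              # select mode by voice
    ctrl.update(HeadPose(yaw=10.0, pitch=5.0), button_pressed=True)   # engage clutch
    print(ctrl.update(HeadPose(yaw=14.0, pitch=6.0), button_pressed=True))  # (2.0, 0.5)
```

The clutch (confirmation button) decouples normal head movement from exoscope motion, so the surgeon can look around freely and only reposition the view deliberately.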
year | journal | country | edition | language
---|---|---|---|---
2005-05-01 | International Congress Series | | |