Mechatronic elements and haptic rendering for computer-assisted minimally invasive surgery training
Technological advances of the last decades have enabled a set of new medical techniques that differ from traditional open surgery. These techniques, referred to as minimally invasive surgery (MIS), use complex instruments and imaging devices to reach and treat anatomical regions through small incisions or natural entry points, resulting in reduced trauma and shorter recovery times. For the surgeon, MIS introduces unnatural hand-eye coordination and anatomical representation that require specific training. One alternative is the use of virtual reality (VR) simulations of MIS procedures coupled to input devices mimicking the instruments used by surgeons. To achieve high fidelity with such a computer-assisted training system, the input device must also reproduce the sensations experienced during a procedure by means of force-reflective technology, often referred to as haptics. This work explores the mechatronic aspects involved in the realization of such a haptic device for MIS training applications. An analysis of different MIS procedures distinguishes two categories: lumen-guided and cavity procedures. A simple classification of the required mechanisms can be extracted from these observations. To fulfil the requirements, an implementation should follow a series of guidelines, presented here, in terms of actuation and power transmission. The interactions between a human operator and a haptic interface involve both force and motion. Impedance control generates a force command based on the position of the haptic interface. This process, called haptic rendering, is an integral part of the VR environment. This work proposes to extend impedance control with model-based compensation of the dynamics and non-linearities (such as friction) of the haptic device. Because this compensation is sensitive to model uncertainties, the proposed control architecture is completed by a parallel implicit force controller.
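The control scheme described above can be illustrated with a minimal sketch: an impedance law renders a virtual spring-damper wall, and a model-based feed-forward term compensates device friction. All gains and friction parameters here are hypothetical placeholders, not values from this work.

```python
import math

def impedance_force(x, x_dot, x_wall, k=200.0, b=5.0):
    """Impedance control: force command from the device position.
    Renders a virtual wall at x_wall as a spring-damper (hypothetical gains)."""
    penetration = x_wall - x
    if penetration <= 0.0:
        return 0.0                      # free space: no force
    return k * penetration - b * x_dot  # push back, damp motion into the wall

def friction_compensation(x_dot, f_coulomb=0.3, f_viscous=0.05, eps=1e-3):
    """Model-based feed-forward cancelling Coulomb + viscous friction.
    tanh() smooths the sign of the velocity to avoid chatter near zero."""
    return f_coulomb * math.tanh(x_dot / eps) + f_viscous * x_dot

def motor_command(x, x_dot, x_wall):
    """Total actuator command: rendered force plus friction feed-forward."""
    return impedance_force(x, x_dot, x_wall) + friction_compensation(x_dot)
```

Because such compensation degrades when the friction model is inaccurate, the thesis pairs it with a parallel implicit force controller driven by a force measurement.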
Since the computational load of VR simulation for complex anatomies is high, update rates reach only 20 to 30 Hz. These frequencies are not sufficient to control a haptic device and meet the sensitivity requirements of the human sensory system. Therefore, a multirate approach is introduced in such a system. A model is proposed for the dynamics of a haptic interface and its user. Based on this model, the effect of the introduced implicit force controller is studied and different interfacing techniques between the VR simulation and the haptic controller are evaluated. Since force sensing is a requirement for the implementation of the proposed implicit force controller, a technique using infrared reflective sensors is proposed. A modelling method for such sensors is introduced and implementation issues for force sensing are discussed. Finally, the topics discussed in this work are applied to the development of a computer-assisted training system for Interventional Radiology.
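One common way to bridge the rate gap described above is a local first-order model: the slow VR simulation periodically publishes a force and a local stiffness, and the fast haptic loop evaluates that linearization at every servo tick. This is a generic sketch of such an interfacing technique, not the specific scheme evaluated in the thesis; the class and its members are illustrative names.

```python
class MultirateCoupler:
    """Couples a slow VR simulation (~20-30 Hz) to a fast haptic loop (~1 kHz)
    by holding a local linear force model between VR updates."""

    def __init__(self):
        self.f0 = 0.0  # force at the last VR update
        self.k = 0.0   # local stiffness around the last VR position
        self.x0 = 0.0  # device position at the last VR update

    def vr_update(self, force, stiffness, position):
        """Called at the slow VR rate with the simulation's current state."""
        self.f0, self.k, self.x0 = force, stiffness, position

    def haptic_force(self, x):
        """Called at the fast haptic rate: evaluate the local linearization."""
        return self.f0 + self.k * (x - self.x0)
```

Between two VR frames the haptic controller thus still reacts to motion at full servo rate, instead of holding a stale force sample for 30 to 50 ms.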
Section de microtechnique
Faculté des sciences et techniques de l'ingénieur
Institut de production et robotique
Jury: Dwight Meglan, Alfred Rufer, Paul Xirouchakis, Jurjen Zoethout
Public defense: 2005-09-16
Record created on 2005-07-12, modified on 2016-08-08