Multisensory learning cues using analytical collision detection between a needle and a tube
We are developing a Virtual Reality-based training system for micromanipulation in collaboration with the National University Hospital in Singapore. While conventional approaches judge a virtual environment by its resemblance to the real environment, we use simple environments containing only selected features of the real task, and develop fast algorithms to investigate the learning of dexterity primitives under various multisensory cues. For the needle-maneuvering primitive, this paper introduces a method based on stereographic projection to compute the distance between the curved needle and a curved tube, which is necessary for investigating multisensory cues systematically. This analytical algorithm is shown to be faster than numerical ones by orders of magnitude, and its computation time barely increases with increasing precision, a critical requirement for simulating the microworld.
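The paper's stereographic-projection algorithm is not reproduced here, but the kind of closed-form evaluation that makes analytical distance queries fast can be illustrated with a simpler, well-known case: the distance from a point to a torus (an idealized curved tube) has an exact expression, so no iterative minimization is needed per query. The function names, the sampled needle points, and the radii below are illustrative assumptions, not the paper's actual geometry.

```python
import math

def point_to_torus_distance(p, R, r):
    """Analytic distance from point p = (x, y, z) to the surface of a torus
    centered at the origin with centerline radius R and tube radius r
    (torus axis = z-axis)."""
    x, y, z = p
    rho = math.hypot(x, y)                 # radial distance in the xy-plane
    d_centerline = math.hypot(rho - R, z)  # distance to the tube's centerline circle
    return abs(d_centerline - r)           # distance to the tube surface

def needle_tube_min_distance(needle_pts, R, r):
    """Minimum distance between sampled needle points and the tube surface.
    Each point costs a constant number of arithmetic operations, so precision
    of the per-point distance does not grow the computation time."""
    return min(point_to_torus_distance(p, R, r) for p in needle_pts)

# Example: three sample points on a probe approaching a tube of radius 0.5
# wrapped around a centerline circle of radius 2.
pts = [(2.0, 0.0, 1.5), (2.0, 0.0, 1.0), (2.0, 0.0, 0.6)]
d = needle_tube_min_distance(pts, R=2.0, r=0.5)  # ≈ 0.1 (closest sample at z = 0.6)
```

Because each distance query is a fixed arithmetic formula rather than a convergence loop, the cost per query does not depend on the requested precision, which is the property the abstract highlights for simulating the microworld.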