Title: Data-Driven Visual Tracking in Retinal Microsurgery
Authors: Sznitman, Raphael; Ali, Karim; Richa, Rogerio; Taylor, Russell; Hager, Gregory; Fua, Pascal
Date: 2012-08-09
Year: 2012
DOI: 10.1007/978-3-642-33418-4_70
URL: https://infoscience.epfl.ch/handle/20.500.14299/84474
Type: conference paper

Abstract: In the context of retinal microsurgery, visual tracking of surgical instruments is a key component of robotic assistance. The main reason most existing strategies fail on in-vivo image sequences is that the complex and severe changes in instrument appearance are difficult to model. This paper introduces a novel approach that is both data-driven and complementary to existing tracking techniques. In particular, we show how to learn and integrate an accurate detector with a simple gradient-based tracker within a robust pipeline that runs at framerate. In addition, we present a fully annotated dataset of retinal instruments in in-vivo surgeries, which we use to quantitatively validate our approach. We also demonstrate an application of our method to a laparoscopy image sequence.