Infoscience

Conference paper

Learning from demonstrations with partially observable task parameters

Robot learning from demonstrations requires the robot to learn and adapt movements to new situations, often characterized by the positions and orientations of objects or landmarks in the robot's environment. In the task-parameterized Gaussian mixture model framework, movements are modulated with respect to a set of candidate frames of reference (coordinate systems) attached to objects in the robot workspace. Following this approach, this paper addresses the problem of missing candidate frames during demonstrations and reproductions, which can arise in various situations such as visual occlusion, sensor unavailability, or tasks with a variable number of descriptive features. We study this problem with a dust sweeping task in which the robot must consider a variable number of dust areas to clean on each reproduction trial.
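In the task-parameterized GMM framework, a common reproduction step fuses the predictions expressed in each candidate frame via a product of Gaussians; a frame that is unavailable (e.g. occluded) can then simply be left out of the product. A minimal sketch of this fusion step, assuming each frame already provides a Gaussian prediction in the global coordinate system (the function name and data layout are illustrative, not from the paper):

```python
import numpy as np

def fuse_frames(frame_means, frame_covs, available):
    """Fuse per-frame Gaussian predictions with a product of Gaussians.

    frame_means: list of mean vectors, one per candidate frame
    frame_covs:  list of covariance matrices, one per candidate frame
    available:   list of booleans; False marks a missing frame
                 (e.g. visual occlusion or sensor unavailability)
    Returns the fused mean and covariance.
    """
    prec_sum = np.zeros_like(frame_covs[0])
    weighted_mean = np.zeros_like(frame_means[0])
    for mu, sigma, ok in zip(frame_means, frame_covs, available):
        if not ok:
            continue  # missing frame: contributes nothing to the product
        prec = np.linalg.inv(sigma)      # precision of this frame's prediction
        prec_sum += prec
        weighted_mean += prec @ mu
    cov = np.linalg.inv(prec_sum)        # fused covariance
    mean = cov @ weighted_mean           # precision-weighted fused mean
    return mean, cov

# Two frames with equal confidence pull the result to the midpoint;
# dropping one frame leaves the other's prediction unchanged.
means = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
covs = [np.eye(2), np.eye(2)]
fused_mean, _ = fuse_frames(means, covs, [True, True])   # -> [1., 1.]
solo_mean, _ = fuse_frames(means, covs, [True, False])   # -> [0., 0.]
```

The precision-weighted form makes the role of a missing frame explicit: skipping it is equivalent to assigning it infinite covariance, so the remaining frames fully determine the reproduced point.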

Reference: EPFL-CONF-198846

Record created on 2014-05-19, modified on 2016-08-09
