Robot Learning with Task-Parameterized Generative Models

Task-parameterized models provide a representation of movement/behavior that can adapt to a set of task parameters describing the current situation encountered by the robot, such as the locations of objects or landmarks in its workspace. This paper gives an overview of the task-parameterized Gaussian mixture model (TP-GMM) introduced in previous publications, and presents several extensions and ongoing challenges that must be addressed to move the approach toward unconstrained environments. In particular, it discusses the model's generalization capability and the handling of movements with a high number of degrees of freedom. It then shows that the method is not restricted to movements in task space: it can also be exploited to handle constraints in joint space, including priority constraints.
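To make the adaptation idea concrete, the following is a minimal sketch (not the paper's implementation) of the step commonly associated with TP-GMM: each mixture component is stored as a local Gaussian in the coordinate system of each candidate frame, and when the current frame poses (task parameters) are observed, the frame-local Gaussians are mapped into the world frame and fused by a product of Gaussians. The function name and data layout here are illustrative assumptions.

```python
import numpy as np

def adapt_component(local_gaussians, frame_poses):
    """Fuse frame-local Gaussians into one global Gaussian.

    local_gaussians: list of (mu, Sigma) pairs, each expressed in
                     the coordinates of one candidate frame
    frame_poses:     list of (A, b) pairs mapping each frame to the
                     world frame (rotation/linear part A, origin b)
    """
    Lambda_sum = 0.0  # accumulated precision matrices
    eta_sum = 0.0     # accumulated precision-weighted means
    for (mu, Sigma), (A, b) in zip(local_gaussians, frame_poses):
        mu_w = A @ mu + b          # component mean in the world frame
        Sigma_w = A @ Sigma @ A.T  # component covariance in the world frame
        Lambda = np.linalg.inv(Sigma_w)
        Lambda_sum = Lambda_sum + Lambda
        eta_sum = eta_sum + Lambda @ mu_w
    # Product of Gaussians: precisions add, means are precision-weighted
    Sigma_hat = np.linalg.inv(Lambda_sum)
    mu_hat = Sigma_hat @ eta_sum
    return mu_hat, Sigma_hat
```

The fused covariance shrinks along directions where the frames agree, which is what lets the retrieved movement generalize when objects or landmarks move: frames that constrain the motion tightly dominate the product, while loosely constraining frames contribute little.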

Presented at:
Proc. Intl Symp. on Robotics Research

Record created 2015-12-19, last modified 2018-03-17
