Task Parameterization Using Continuous Constraints Extracted From Human Demonstrations

In this work, we propose an approach for learning task specifications automatically by observing human demonstrations. These specifications allow a robot to combine representations of individual actions to achieve a high-level goal. We hypothesize that task specifications consist of variables that exhibit a pattern of change that is invariant across demonstrations. We identify these specifications at different stages of task completion: changes in the task constraints reveal transitions in the task description and allow us to segment the task into sub-tasks. We extract the following task-space constraints: (1) the reference frame in which to express the task variables; (2) the variable of interest at each time step (end-effector position or force); and (3) a factor that modulates the relative contribution of force and position in a hybrid impedance controller. The approach was validated on a 7-DOF KUKA arm performing two different tasks: grating vegetables and extracting a battery from a charging stand.
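To make the third extracted constraint concrete, the following minimal sketch shows how a learned modulation factor could blend position-tracking and force-tracking terms along one task axis in a hybrid impedance controller. The gains, symbols, and function name here are illustrative assumptions, not the paper's actual formulation or notation.

```python
# Hypothetical sketch of a hybrid impedance control law: a learned
# factor alpha blends a position-tracking (stiffness) term and a
# force-tracking term along one task-space axis. All gains and names
# are illustrative assumptions, not taken from the paper.

def hybrid_command(x, x_d, f, f_d, alpha, k_p=100.0, k_f=1.0):
    """Blended command along one axis.

    alpha in [0, 1] is the learned modulation factor:
    alpha = 1 -> pure position control, alpha = 0 -> pure force control.
    """
    pos_term = k_p * (x_d - x)      # virtual spring toward desired position
    force_term = k_f * (f_d - f)    # drive measured force toward desired force
    return alpha * pos_term + (1.0 - alpha) * force_term

# Example: an axis that is mostly force-controlled, as one might expect
# while pressing a vegetable against a grater.
u = hybrid_command(x=0.10, x_d=0.12, f=4.0, f_d=5.0, alpha=0.2)
```

In such a scheme, the extraction step would estimate alpha per sub-task from the variance of position and force signals across demonstrations, so that consistently regulated variables dominate the command.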


Published in:
IEEE Transactions on Robotics, vol. 31, no. 6, pp. 1458-1471
Year:
2015
Publisher:
Piscataway, Institute of Electrical and Electronics Engineers
ISSN:
1552-3098




