000117918 001__ 117918
000117918 005__ 20190316234136.0
000117918 0247_ $$2doi$$a10.1142/S0219843608001376
000117918 02470 $$2DAR$$a13752
000117918 02470 $$2ISI$$a000260488500002
000117918 037__ $$aARTICLE
000117918 245__ $$aOnline learning of the body schema
000117918 269__ $$a2008
000117918 260__ $$c2008
000117918 336__ $$aJournal Articles
000117918 520__ $$aWe present an algorithm enabling a humanoid robot to visually learn its body schema, knowing only the number of degrees of freedom in each limb. By “body schema” we mean the joint positions and orientations, and thus the kinematic function. The learning is performed by visually observing the end-effectors while moving them. Simulations involving a body schema of more than 20 degrees of freedom show that the system scales to a high number of degrees of freedom. Real robot experiments confirm the practicality of our approach. Our results illustrate how a subjective space representation can develop as a result of sensorimotor contingencies.
000117918 6531_ $$aKinematic learning; tool use adaptation; body schema; peripersonal space representation; multimodal integration; developmental robotics
000117918 700__ $$0240389$$aHersch, M.$$g114245
000117918 700__ $$0240845$$aSauser, E.$$g119102
000117918 700__ $$0240594$$aBillard, A.$$g115671
000117918 773__ $$j5$$k2$$q161–181$$tInternational Journal of Humanoid Robotics
000117918 8564_ $$s875522$$uhttps://infoscience.epfl.ch/record/117918/files/IJHR_0502_P161.pdf$$zn/a
000117918 909C0 $$0252119$$pLASA$$xU10660
000117918 909CO $$ooai:infoscience.tind.io:117918$$pSTI$$particle$$qGLOBAL_SET
000117918 937__ $$aLASA-ARTICLE-2008-011
000117918 973__ $$aEPFL$$rREVIEWED$$sPUBLISHED
000117918 980__ $$aARTICLE