Online and markerless motion retargetting with kinematic constraints

Behzad Dariush, Michael Gienger, Arjun Arumbakkam, Christian Goerick, Youding Zhu, Kikuo Fujimura, "Online and markerless motion retargetting with kinematic constraints", IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2008.

Abstract

Transferring motion from a human demonstrator to a humanoid robot is an important step toward developing robots that are easily programmable and that can replicate or learn from observed human motion. The so-called motion retargeting problem has been well studied, and several offline solutions exist based on optimization approaches that rely on pre-recorded human motion data collected from a marker-based motion capture system. From the perspective of human-robot interaction, there is growing interest in online and markerless motion transfer. Such requirements place stringent demands on retargeting algorithms and limit the potential use of offline, pre-recorded methods. To address these limitations, we present an online task-space control-theoretic retargeting formulation to generate robot joint motions that adhere to the robot's joint limit constraints, self-collision constraints, and balance constraints. The inputs to the proposed method are low-dimensional normalized human motion descriptors, detected and tracked using a vision-based feature detection and tracking algorithm. The proposed vision algorithm does not rely on markers placed on anatomical landmarks, nor does it require special instrumentation or calibration. The current implementation requires a depth image sequence, collected from a single time-of-flight imaging device. We present online experimental results of the entire pipeline on the Honda humanoid robot ASIMO.
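The core idea of online task-space retargeting — driving joint velocities so that a robot's end effector tracks a moving task-space target while respecting joint limits — can be illustrated with a minimal sketch. The following is not the paper's actual formulation (which additionally handles self-collision and balance constraints on a full humanoid); it is a generic damped-least-squares resolved-rate controller on a hypothetical 2-link planar arm, with joint limits enforced by clamping:

```python
import numpy as np

def retarget_step(q, jac, x_err, q_min, q_max, gain=1.0, dt=0.01, damping=1e-2):
    """One step of damped-least-squares task-space velocity control:
    qdot = J^T (J J^T + lambda^2 I)^-1 (gain * x_err),
    followed by a clamp to the joint limits.
    Illustrative only -- the paper's formulation also enforces
    self-collision and balance constraints, omitted here."""
    JJt = jac @ jac.T
    lam = damping * np.eye(JJt.shape[0])
    qdot = jac.T @ np.linalg.solve(JJt + lam, gain * x_err)
    return np.clip(q + dt * qdot, q_min, q_max)

# Toy 2-link planar arm (link lengths 1.0) used as a stand-in "robot".
def fk(q):
    """Forward kinematics: end-effector position in the plane."""
    return np.array([np.cos(q[0]) + np.cos(q[0] + q[1]),
                     np.sin(q[0]) + np.sin(q[0] + q[1])])

def jacobian(q):
    """Analytic task Jacobian of fk with respect to the joint angles."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-s1 - s12, -s12],
                     [ c1 + c12,  c12]])

# Track a fixed task-space target from an arbitrary start posture.
q = np.array([0.3, 0.5])
q_min, q_max = np.array([-2.0, 0.0]), np.array([2.0, 2.5])
target = np.array([1.2, 0.8])
for _ in range(500):
    q = retarget_step(q, jacobian(q), target - fk(q), q_min, q_max,
                      gain=5.0, dt=0.02)
err = np.linalg.norm(target - fk(q))
```

In an online setting, `target` would be updated each cycle from the tracked human motion descriptors rather than held fixed; the damping term keeps the joint velocities bounded near kinematic singularities, which matters when the input motion is noisy vision data.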
