Disney’s Research Hub has released a video illustrating the ideas presented in a recent paper, which shows how animal motions and animations can be transferred onto legged robots, enabling them to perform highly dynamic motions.
The paper calls the approach Differentiable Optimal Control (DOC): it interfaces with either motion-capture or animation data to formulate retargeting objectives for the robot, whose solution minimises the retargeting error.
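The paper's full pipeline is considerably more involved, but the basic idea of a retargeting objective, choosing joint angles that drive the error between the robot and a reference keypoint to zero, can be sketched on a toy planar two-link leg. Everything below (link lengths, function names, the reference point) is illustrative, not taken from the paper; for this toy leg the minimiser even has a closed form.

```python
import math

# Illustrative planar two-link leg; link lengths are assumptions, not from the paper.
L1, L2 = 0.3, 0.3

def foot_pos(hip, knee):
    """Forward kinematics: foot (x, y) for joint angles hip and knee."""
    x = L1 * math.cos(hip) + L2 * math.cos(hip + knee)
    y = L1 * math.sin(hip) + L2 * math.sin(hip + knee)
    return x, y

def retarget(x, y):
    """Minimise the retargeting error (robot foot vs. reference keypoint).

    For a two-link leg the minimiser is the classic closed-form inverse
    kinematics solution (elbow-down branch).
    """
    c = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    knee = math.acos(max(-1.0, min(1.0, c)))
    hip = math.atan2(y, x) - math.atan2(L2 * math.sin(knee),
                                        L1 + L2 * math.cos(knee))
    return hip, knee

# One reference keypoint, standing in for a single frame of motion-capture data.
hip, knee = retarget(0.35, 0.25)
x, y = foot_pos(hip, knee)
```

In the paper this objective is posed over whole trajectories and differentiated through an optimal-control problem; the sketch only shows the per-frame error being minimised.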
Human-to-robot or animal-to-robot motion retargeting allows a robot to follow the motion of a subject.
To validate the modelling, the research team applied DOC to a Model-Predictive Control (MPC) formulation, showing retargeting results for several robots of varying proportions and mass distributions.
Through hardware deployment, the team showed that the retargeted motions are physically feasible on real robots.
A team spokesperson added that the use of “MPC ensures that the robots retain their capability to react to unexpected disturbances”.
The project was completed in partnership with ETH Zürich and ANYbotics.