Generalized Biped Walking Control
Contribution:
Evaluation:
Reproducibility:
Improvements:
--
MichielVanDePanne - 24 Nov 2011
(a) What is the contribution of the paper?
The authors present a generalized real-time control strategy for walking bipedal characters. Specific tuning with respect to parameters or characters is not required, the controller is robust to disturbances, and control strategies can be successfully authored by non-expert users. The work integrates a number of constituent components drawn from the past 16 years of the control literature.
(b) How are the results evaluated?
Results are provided in the form of two videos. They show that the proposed walking controllers generalize across (1) gait parameters (both forward and backward walking), (2) styles (authored by novice users), (3) characters (with arbitrary proportions), and (4) tasks (such as reaching, moving a crate, and navigating around obstacles, stairs, and crowds).
(c) Is the paper reproducible?
Yes. Source code is provided by the authors, along with all implementation details (parameters, etc.) required to reproduce the simulation in ODE.
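To give a concrete sense of what reproducing the simulation involves, here is a minimal C++ sketch of an ODE world containing a single PD-driven hinge joint, the basic building block of a simulated character. This is not the authors' code: the link masses, geometry, gains, and time step are illustrative placeholders, and collision handling with the ground is omitted for brevity.
<verbatim>
#include <ode/ode.h>

int main() {
    dInitODE();
    dWorldID world = dWorldCreate();
    dWorldSetGravity(world, 0.0, -9.8, 0.0);    // y-up world; gravity value is a placeholder

    // Two links connected by a hinge, standing in for one joint of the biped.
    dBodyID upper = dBodyCreate(world);
    dBodyID lower = dBodyCreate(world);
    dMass m;
    dMassSetBoxTotal(&m, 5.0, 0.1, 0.4, 0.1);   // illustrative mass and box dimensions
    dBodySetMass(upper, &m);
    dBodySetMass(lower, &m);
    dBodySetPosition(upper, 0.0, 1.0, 0.0);
    dBodySetPosition(lower, 0.0, 0.6, 0.0);

    dJointID knee = dJointCreateHinge(world, 0);
    dJointAttach(knee, upper, lower);
    dJointSetHingeAnchor(knee, 0.0, 0.8, 0.0);
    dJointSetHingeAxis(knee, 1.0, 0.0, 0.0);

    const double kp = 300.0, kd = 30.0;         // placeholder PD gains
    const double dt = 0.0005;                   // placeholder time step

    for (int step = 0; step < 2000; ++step) {
        double target = 0.5;                    // desired joint angle in radians (placeholder)
        double angle  = dJointGetHingeAngle(knee);
        double rate   = dJointGetHingeAngleRate(knee);
        double torque = kp * (target - angle) - kd * rate;  // PD tracking torque
        dJointAddHingeTorque(knee, torque);
        dWorldStep(world, dt);
    }

    dWorldDestroy(world);
    dCloseODE();
    return 0;
}
</verbatim>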
(d) How could the paper research or paper writing be improved?
No complaints here.
--
DanielTroniak - 24 Nov 2011
Contribution
This study combines four control components in a novel way to produce a controller for physically simulated walking motions. As a result, the controller generalizes across gait parameters, motion styles, character proportions, and a variety of skills.
It also provides a user-friendly tool with which even novice users can create a character with the desired proportions and immediately see the resulting motion.
Evaluation
The results are evaluated over a range of gait parameters: walking forwards and backwards, varying walking speeds and stepping frequencies, and turning towards a desired direction. The study demonstrates control over these gait parameters for characters with different body types and for different styles. Generalization across several motion styles is also shown in the accompanying videos, as is the framework's ability to generalize across characters with varying proportions and across tasks such as reaching and pulling/pushing a crate. The user interface for authoring motion styles and character proportions is also tested with novice users for ease of use.
The framework is further tested under perturbed parameters, such as doubled and halved PD gains and varying alpha values for the inverted pendulum model (IPM). Lastly, the contribution of each control component is analyzed and shown in the provided videos.
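For readers unfamiliar with the alpha parameter mentioned above, the following C++ sketch shows one common way an inverted-pendulum-model step-location estimate can be scaled by such a gain, together with a joint PD term whose kp/kd gains correspond to the values that were doubled and halved in the tests. The formulas are the textbook linear-IPM and PD versions and are assumptions here, not the paper's exact equations; all numbers are illustrative.
<verbatim>
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Textbook linear inverted-pendulum step-location estimate, scaled by alpha:
// d = alpha * v * sqrt(h / g). Assumed form; the paper's exact expression may differ.
double ipmStepLocation(double comVel, double comHeight, double alpha, double g = 9.8) {
    return alpha * comVel * std::sqrt(comHeight / g);
}

// Joint PD tracking torque; kp and kd are the gains that the robustness tests
// double and halve.
double pdTorque(double kp, double kd, double targetAngle, double angle, double angVel) {
    return kp * (targetAngle - angle) - kd * angVel;
}

int main() {
    // Illustrative numbers only: 1 m/s forward COM velocity, 0.9 m COM height.
    for (double alpha : {0.5, 1.0, 1.5}) {
        std::printf("alpha = %.1f -> step offset = %.3f m\n",
                    alpha, ipmStepLocation(1.0, 0.9, alpha));
    }
    std::printf("example PD torque = %.1f Nm\n", pdTorque(300.0, 30.0, 0.5, 0.3, 0.1));
    return 0;
}
</verbatim>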
Reproducibility
The framework is reproducible, as the source code is open-source, and the methodology of the implementation is detailed enough to follow.
Improvements
In general, the paper is well-organized. The sections are divided meaningfully and are easy to follow.
ozgur
-- Main.ooguz - 24 Nov 2011