Morph: A Motion-free Physics Optimization Framework for Human Motion Generation
Zhuo Li, Mingshuang Luo, Ruibing Hou, Xin Zhao, Hao Liu, Hong Chang, Zimo Liu, Chen Li
2024-11-28

Summary
This paper introduces Morph, a new framework designed to generate realistic human motions for digital characters and robots without relying on real-world motion data.
What's the problem?
Generating human motion often leads to unrealistic movements, like characters floating or sliding their feet, because many existing methods don't consider the laws of physics. This can make animations look unnatural and unconvincing.
What's the solution?
Morph addresses this issue with a two-part system: a Motion Generator that creates synthetic motion data and a Motion Physics Refinement module that improves this data by applying physics rules. The Motion Generator provides a large amount of initial motion data, which the refinement module projects into a physically plausible space using a motion imitator trained inside a physics simulator. These refined motions are then fed back to fine-tune the Motion Generator, further improving its output. This loop allows Morph to create high-quality motions while avoiding the need for expensive real-world motion capture data.
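The generate-refine-fine-tune loop described above can be sketched as follows. This is a minimal illustrative stub, not the paper's actual implementation: all names are hypothetical, and the "physics refinement" is stood in for by a simple clamp, whereas Morph trains a motion imitator inside a real physics simulator.

```python
# Hypothetical sketch of Morph's optimization loop; names are illustrative.

def generate_motions(generator, n):
    """Stage 1: the Motion Generator emits noisy synthetic motions."""
    return [generator(i) for i in range(n)]

def physics_refine(motion):
    """Stage 2 (stand-in): project a noisy motion into a plausible range.
    In Morph this is a motion imitator in a physics simulator enforcing
    constraints such as ground contact; here we simply clamp to [0, 1]."""
    return [max(0.0, min(1.0, x)) for x in motion]

def fine_tune(generator, refined):
    """Stage 3 (stand-in): use refined motions as supervision.
    Here we just return a generator biased toward the refined mean."""
    mean = sum(sum(m) / len(m) for m in refined) / len(refined)
    return lambda i: [mean, mean, mean]

# Toy generator producing 3-frame "motions", some values out of range.
gen = lambda i: [1.5 * i - 1.0, 0.5, 2.0]

noisy = generate_motions(gen, 4)
refined = [physics_refine(m) for m in noisy]
gen = fine_tune(gen, refined)  # refined data improves the generator
```

The key structural point is the feedback: refined outputs become training signal for the generator, so no real motion-capture data enters the loop.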
Why it matters?
This research is significant because it enhances the realism of animated characters and robots, making them move more naturally. By improving how we generate human motion without needing real-world data, Morph can be used in various applications like video games, movies, and robotics, ultimately leading to more lifelike digital interactions.
Abstract
Human motion generation plays a vital role in applications such as digital humans and humanoid robot control. However, most existing approaches disregard physics constraints, leading to the frequent production of physically implausible motions with pronounced artifacts such as floating and foot sliding. In this paper, we propose Morph, a Motion-free physics optimization framework, comprising a Motion Generator and a Motion Physics Refinement module, for enhancing physical plausibility without relying on costly real-world motion data. Specifically, the Motion Generator is responsible for providing large-scale synthetic motion data, while the Motion Physics Refinement Module utilizes these synthetic data to train a motion imitator within a physics simulator, enforcing physical constraints to project the noisy motions into a physically-plausible space. These physically refined motions, in turn, are used to fine-tune the Motion Generator, further enhancing its capability. Experiments on both text-to-motion and music-to-dance generation tasks demonstrate that our framework achieves state-of-the-art motion generation quality while improving physical plausibility drastically.