Motion2Motion is a training-free motion migration framework that addresses animation retargeting between characters with different skeletal structures. Traditional retargeting techniques usually require the source and target characters to share similar skeletal topologies, which breaks down when the characters are vastly different (e.g., humans and quadrupeds). Motion2Motion takes an innovative approach that eliminates the need for large-scale paired motion datasets: the user only provides a small number of sample motions for the target character and specifies sparse correspondences between the source and target skeletons (as few as 6 pairs) to achieve high-quality motion migration. The framework maps motion segments of the source character into the target character's motion space through a novel motion matching approach and reconstructs a natural, coherent animation, which is highly convenient for creators in game development, film and television animation, and related fields.
Feature List
- Cross-Topology Motion Retargeting: migrate motions between characters with vastly different skeletal structures, e.g., transfer a snake's attack motion to a Tyrannosaurus Rex.
- Biped and Quadruped Retargeting: achieve high-quality motion transfer between characters with different locomotion patterns, e.g., migrating a flamingo's walk to a monkey while generating natural limb, arm, and tail dynamics.
- Sparse Bone Correspondence Matching: the user does not need to tediously bind every bone one by one; specifying a small number of key bone correspondences (as few as 6) is enough for the system to complete the motion migration.
- Support for Sparse Source Keyframes: even when the input source animation contains only sparse, discontinuous keyframes, the system can reconstruct a complete and coherent target animation.
- Blender Plugin Integration: a Blender plugin is provided, enabling real-time, visual motion retargeting for animators within a professional workflow.
- SMPL Model Compatibility: easily migrate motions captured on standard body models such as SMPL to game characters with complex costumes and dynamic effects.
Usage Guide
At its core, Motion2Motion is a training-free motion migration framework: users do not need to prepare large-scale datasets or train any models before using it. It is delivered to end users primarily as a Blender plugin, enabling seamless integration into professional animation production environments.
Core workflow
According to the system overview on its project homepage, the Motion2Motion workflow consists of the following steps:
- Prepare the input data:
- Source Motion: the animation sequence you wish to migrate, such as a motion-captured clip of a human dancing.
- Target Skeleton: The skeleton of the character model to which you wish to apply the animation, such as a monster character in a game.
- Target Motion Database: one or a few example motion clips for the target character. This library does not need to be large; it is used by the system to understand the target character's movement characteristics.
- Define sparse correspondences:
This is the most crucial step in the whole process. The user creates associations between the bones of the source and target characters. Unlike traditional methods, Motion2Motion does not require one-to-one binding of every bone.
- In the Blender plugin's interface, you only need to match a few kinematically similar key parts of the two characters' skeletons. For example, associate the source character's "head" with the target's "head", the "left hand" with the target's "left front paw", and the "spine" with the target's "spine".
- According to the official demo, binding as few as 6 pairs of key bones (head, spine, hips, left and right feet, and left hand) is enough to achieve the desired effect. This sparse mapping greatly reduces setup complexity.
- Execute the motion migration:
- Once setup is complete, start the migration process in the plugin. The system automatically runs the following operations in the background.
- Motion decomposition and matching: the system first breaks the source motion sequence into many small, overlapping "motion patches".
- Spatial projection and retrieval: next, each source motion patch is "projected" into the target character's skeleton space using the sparse bone mapping you defined, forming a query. The system then uses this query to search the sample motion library you provided for the target character and retrieves the best-matching motion clips.
- Blending and reconstruction: the system blends the retrieved best-matching target motion clips by weighted averaging and, from the blended result, reconstructs a complete, smooth animation that fits the target character's physical structure.
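The decompose-project-retrieve-blend pipeline above can be sketched in plain Python. This is a minimal illustration under heavy simplifying assumptions (scalar joint values instead of full rotations, Euclidean distance as the matching metric, a simple center-weighted blend over patch overlaps); all names here are hypothetical and do not reflect Motion2Motion's actual API.

```python
# Hypothetical sketch of sparse-correspondence patch matching and blending.
# Frames are dicts mapping joint name -> scalar value (a stand-in for rotations).

# Sparse correspondence: source joint -> target joint (as few as 6 pairs).
CORRESPONDENCE = {
    "head": "head", "spine": "spine", "hips": "hips",
    "left_foot": "left_rear_paw", "right_foot": "right_rear_paw",
    "left_hand": "left_front_paw",
}

def to_patches(frames, size=8, stride=4):
    """Split a motion (list of frames) into overlapping fixed-size patches."""
    return [(i, frames[i:i + size])
            for i in range(0, max(1, len(frames) - size + 1), stride)]

def project(patch, mapping):
    """Project a source patch into target joint space via the sparse mapping."""
    return [{tgt: frame[src] for src, tgt in mapping.items()} for frame in patch]

def distance(a, b):
    """Frame-wise squared distance over the shared (mapped) joints."""
    return sum((fa[j] - fb[j]) ** 2
               for fa, fb in zip(a, b) for j in fa if j in fb)

def best_match(query, database, size=8):
    """Retrieve the database patch closest to the projected query."""
    candidates = [frames for _, frames in to_patches(database, size, 1)]
    return min(candidates, key=lambda c: distance(query, c))

def retarget(source, database, mapping, size=8, stride=4):
    """Match each source patch to the database, then blend overlapping results."""
    n = len(source)
    acc = [dict() for _ in range(n)]  # weighted sum of joint values per frame
    wgt = [dict() for _ in range(n)]  # sum of blend weights per frame
    for start, patch in to_patches(source, size, stride):
        match = best_match(project(patch, mapping), database, size)
        for k, frame in enumerate(match):
            t = start + k
            if t >= n:
                break
            # Center-weighted cross-fade: frames near the patch center dominate.
            w = 1.0 - abs(k - size / 2) / (size / 2 + 1)
            for joint, value in frame.items():
                acc[t][joint] = acc[t].get(joint, 0.0) + w * value
                wgt[t][joint] = wgt[t].get(joint, 0.0) + w
    return [{j: acc[t][j] / wgt[t][j] for j in acc[t]} for t in range(n)]
```

The real system operates on full skeletal poses and uses the sample motions as the retrieval database, but the structure is the same: overlapping patches are matched independently, and the overlaps are what make the reconstructed sequence smooth.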
Operations in the Blender plugin
The official video demonstrates the real-time interface of the Blender plugin; the workflow is intuitive for animators:
- Import models and animations: import your source character (with its motion) and your target character into a Blender scene.
- Open the Motion2Motion plugin: after launching the plugin, a dedicated control panel appears.
- Assign characters and motions: in the panel, assign the source and target characters to their respective fields, then load the source character's motion data.
- Create bone correspondences: the plugin provides a visual interface for selecting source and target bones and creating pairings. You can see in real time which bones are bound (e.g., via highlight colors).
- Real-time retargeting: once binding is complete, real-time retargeting is activated. As you play the source character's animation, the target character moves according to the algorithm's output in real time, so you can preview and adjust immediately.
- Export the result: once you are satisfied, bake the generated animation data onto the target character's skeleton for use in a game engine or other 3D software.
Developers wishing to dig deeper or build on the project can visit the official code repository to obtain the source code and follow its documentation for environment setup and usage.
Application Scenarios
- Game Development
In game development, characters with very different forms (e.g., monsters, aliens, animals) often need to be animated. With Motion2Motion, development teams can reuse the same set of motion capture data from human actors to quickly generate high-quality animations for non-human game characters, dramatically saving the cost and time of hand-animating each unique character.
- Film and Television Animation
In animated film or visual effects production, animators can reuse existing libraries of high-quality animation. For example, take an archived running animation of a horse and migrate it to a fictional fantasy creature while preserving both the realism of the movement and the character's uniqueness.
- Virtual Reality and the Metaverse
On VR or metaverse platforms, users' avatars vary greatly. Motion2Motion can apply a user's body movements (captured by VR trackers) in real time to an avatar of any form, whether a standard humanoid or a cartoonish animal, keeping the movements naturally coordinated.
- Robotics Research
Researchers can use the technique to transfer human or animal movement patterns to robots for motion planning and simulation testing, accelerating the development of bionic robots.
FAQ
- What is Motion2Motion?
Motion2Motion is a computer graphics research project published at SIGGRAPH Asia 2025. It proposes a novel, training-free framework for animation migration between characters with vastly different skeletal structures.
- Does this tool require a lot of training data?
No. Motion2Motion is a "training-free" framework that does not rely on large datasets of paired motions. All you need is one or a few simple sample motions for the target character.
- How should I use Motion2Motion?
For animators and designers, the most direct route is the officially provided Blender plugin: import your models in Blender, set up sparse bone correspondences through the plugin interface, and then preview and generate the retargeted animation in real time. Developers can obtain the source code from the GitHub page for deeper integration and development.
- What types of character animation migration does it support?
The framework is very flexible and supports a wide range of complex migration scenarios, including:
- Intra-species migration: e.g., from one type of snake to another.
- Cross-species migration: e.g., from a snake to a Tyrannosaurus Rex.
- Biped-to-quadruped migration: e.g., from humans or birds to quadrupeds such as monkeys, bears, or dogs.
- Standard model to game character: e.g., migrating the motions of an SMPL body model to a complex fantasy game character.