
LivePortrait: A Facial Expression Animation Transfer Project by Kuaishou

Kuaishou's technology for transferring facial expressions onto still images to generate video delivers striking results with fine-grained control. It is a valuable tool for character performance and digital human creation in AI video generation, supporting images in a wide range of styles, adjustable facial movement range, and transfer onto common animal faces.

LivePortrait transfers facial expressions from a driving source onto a target image to generate video, with remarkable quality and precise control. This is of great help for shaping character performances and digital humans in AI video generation: vivid expressions from one face can be transferred to another image, producing expressive video content.

The technology supports images in a variety of styles, whether realistic, cartoon, or abstract, and transfers facial expressions effectively across all of them. Users can also fine-tune the range of facial movement for more precise control, and the method is not limited to human faces: it also supports transfer onto common animal faces, opening up more creative possibilities.
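As a rough illustration of what "fine-tuning the facial movement range" can mean in an implicit-keypoint pipeline like this, the driving motion offsets can be scaled before they are applied to the source keypoints. The function name, array shapes, and 21-keypoint layout below are assumptions for the sketch, not the project's actual API:

```python
import numpy as np

def blend_motion(source_kp: np.ndarray,
                 driving_delta: np.ndarray,
                 strength: float = 1.0) -> np.ndarray:
    """Scale the driving expression offsets before applying them to the
    source keypoints. strength=0 keeps the source face static,
    strength=1 transfers the full driving motion."""
    return source_kp + strength * driving_delta

# Hypothetical usage: 21 implicit 3D keypoints, motion damped to 60%.
source_kp = np.random.rand(21, 3)      # keypoints detected on the source image
driving_delta = np.random.rand(21, 3)  # per-frame offsets from the driving video
animated_kp = blend_motion(source_kp, driving_delta, strength=0.6)
```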

To further improve generation quality and generalization, the research team worked on several fronts. They scaled the training data to roughly 69 million high-quality frames, giving the model far richer material to learn from, and adopted a mixed image-and-video training strategy so the model learns from both kinds of data. They also upgraded the network architecture and designed better motion transformations and optimization objectives, improving the quality and realism of the generated videos.
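To make the mixed image-and-video strategy concrete, here is a minimal PyTorch-style sketch in which still images are treated as single-frame clips so that both data types flow through one training pipeline. The pool structures, batch size, and mixing ratio are assumptions, not the team's actual training code:

```python
import random
import torch

def sample_mixed_batch(image_pool, video_pool, batch_size=8, image_ratio=0.5):
    """Toy image/video mixed sampler.
    image_pool: list of tensors shaped (C, H, W) -- still images
    video_pool: list of tensors shaped (T, C, H, W) -- video clips
    Returns a batch shaped (B, 1, C, H, W) of single-frame "clips".
    """
    batch = []
    for _ in range(batch_size):
        if random.random() < image_ratio:
            img = random.choice(image_pool)
            batch.append(img.unsqueeze(0))           # (1, C, H, W): image as a one-frame clip
        else:
            clip = random.choice(video_pool)
            t = random.randint(0, clip.shape[0] - 1)
            batch.append(clip[t:t + 1])              # one frame sampled from the clip
    return torch.stack(batch)
```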

On the technical side, the team found that compact implicit keypoints can effectively represent a kind of blendshapes. Building on this, they carefully designed a stitching module and two retargeting modules. These modules are small MLPs that add negligible computational overhead while greatly improving controllability, and they play an important role both in fine-tuning facial expressions and in controlling overall motion.
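For intuition, the sketch below shows a small MLP that perturbs implicit keypoints under a control signal, in the spirit of the stitching/retargeting modules described above. The layer widths, keypoint count, and conditioning input (an eye-open ratio) are illustrative assumptions, not the released architecture:

```python
import torch
import torch.nn as nn

class RetargetingMLP(nn.Module):
    """Small MLP mapping flattened implicit keypoints plus a control scalar
    (e.g. an eye-open ratio) to per-keypoint offsets."""
    def __init__(self, num_kp: int = 21, kp_dim: int = 3, cond_dim: int = 1, hidden: int = 128):
        super().__init__()
        in_dim = num_kp * kp_dim + cond_dim
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_kp * kp_dim),
        )
        self.num_kp, self.kp_dim = num_kp, kp_dim

    def forward(self, keypoints: torch.Tensor, condition: torch.Tensor) -> torch.Tensor:
        # keypoints: (B, num_kp, kp_dim), condition: (B, cond_dim)
        x = torch.cat([keypoints.flatten(1), condition], dim=1)
        delta = self.net(x).view(-1, self.num_kp, self.kp_dim)
        return keypoints + delta  # adjusted keypoints passed on to the generator

# Hypothetical usage: nudge the eyes toward a target open ratio.
module = RetargetingMLP()
kp = torch.randn(2, 21, 3)
eye_ratio = torch.full((2, 1), 0.35)
adjusted_kp = module(kp, eye_ratio)
```

Keeping these modules as small MLPs is what lets controllability improve without adding much computational overhead at inference time.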

A ComfyUI plugin, ComfyUI-LivePortraitKJ, is already available. It gives users a more convenient way to apply facial expression transfer to video creation, so that both professional video producers and hobbyists can enjoy the creativity and convenience this technology brings.