UniPortrait: A Unified Framework for Identity-Preserving Single- and Multi-Human Image Personalization
Junjie He, Yifeng Geng, Liefeng Bo
2024-08-13

Summary
This paper presents UniPortrait, a new framework that makes it easy to personalize images of one or more people while preserving high-quality, editable facial features.
What's the problem?
Creating personalized images of people is difficult, especially when several people need to be customized within the same image. Existing methods often struggle to preserve each face's identity and fine details, making it hard to achieve realistic, faithful results.
What's the solution?
UniPortrait simplifies this process with two plug-and-play components: an ID embedding module that extracts editable facial features for each identity and embeds them into the diffusion model's context space, and an ID routing module that adaptively distributes these embeddings to the correct regions of the generated image. The framework handles both single- and multi-person customization with high face fidelity and flexibility, so users can still edit and change facial attributes. It is trained with a two-stage scheme so that it performs well in both single- and multi-ID scenarios; a rough sketch of how the two modules might fit together is shown below.
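The following is a minimal, illustrative PyTorch sketch of the two-module idea, not the paper's actual implementation: the module names, feature dimensions, token count, and the attention-style routing over a U-Net feature map are all assumptions made here for clarity. It only shows how per-ID embeddings could be projected into the diffusion context space and then softly assigned to spatial regions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IDEmbedding(nn.Module):
    """Hypothetical sketch: project per-identity face features (e.g. from a
    face-recognition backbone, extracted upstream) into the diffusion
    model's context space as a few tokens per identity."""

    def __init__(self, face_feat_dim=512, context_dim=768, num_tokens=4):
        super().__init__()
        self.num_tokens = num_tokens
        self.proj = nn.Linear(face_feat_dim, context_dim * num_tokens)
        self.norm = nn.LayerNorm(context_dim)

    def forward(self, face_feats):              # (batch, num_ids, face_feat_dim)
        b, n, _ = face_feats.shape
        tokens = self.proj(face_feats)           # (b, n, context_dim * num_tokens)
        tokens = tokens.view(b, n, self.num_tokens, -1)
        return self.norm(tokens)                 # (b, n, num_tokens, context_dim)


class IDRouter(nn.Module):
    """Hypothetical sketch: for every spatial location of a U-Net feature
    map, predict a soft assignment over identities, so that each ID's
    embedding mainly influences its own region of the synthesized image."""

    def __init__(self, hidden_dim=320, context_dim=768):
        super().__init__()
        self.to_q = nn.Linear(hidden_dim, context_dim, bias=False)
        self.to_k = nn.Linear(context_dim, context_dim, bias=False)

    def forward(self, spatial_feats, id_tokens):
        # spatial_feats: (b, hw, hidden_dim); id_tokens: (b, n, t, context_dim)
        id_summary = id_tokens.mean(dim=2)                    # (b, n, context_dim)
        q = self.to_q(spatial_feats)                          # (b, hw, context_dim)
        k = self.to_k(id_summary)                             # (b, n, context_dim)
        logits = q @ k.transpose(1, 2) / q.shape[-1] ** 0.5   # (b, hw, n)
        routes = F.softmax(logits, dim=-1)                    # per-location ID weights
        # Mix the ID embeddings per location; the result would feed the
        # U-Net's cross-attention alongside the text context.
        routed_context = routes @ id_summary                  # (b, hw, context_dim)
        return routed_context, routes


# Toy usage: two identities, a 16x16 feature map.
face_feats = torch.randn(1, 2, 512)
spatial_feats = torch.randn(1, 16 * 16, 320)
id_tokens = IDEmbedding()(face_feats)
routed_context, routes = IDRouter()(spatial_feats, id_tokens)
```

In this sketch the routing weights act like a learned soft segmentation of the image into per-identity regions, which matches the paper's description of embeddings being "combined and distributed adaptively to their respective regions."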
Why it matters?
This research is significant because it makes it easier to create personalized images for different applications, such as social media, gaming, or virtual reality. By providing a tool that handles both single- and multi-person customization effectively, it opens up new opportunities for creative expression and interaction in digital spaces.
Abstract
This paper presents UniPortrait, an innovative human image personalization framework that unifies single- and multi-ID customization with high face fidelity, extensive facial editability, free-form input description, and diverse layout generation. UniPortrait consists of only two plug-and-play modules: an ID embedding module and an ID routing module. The ID embedding module extracts versatile editable facial features with a decoupling strategy for each ID and embeds them into the context space of diffusion models. The ID routing module then combines and distributes these embeddings adaptively to their respective regions within the synthesized image, achieving the customization of single and multiple IDs. With a carefully designed two-stage training scheme, UniPortrait achieves superior performance in both single- and multi-ID customization. Quantitative and qualitative experiments demonstrate the advantages of our method over existing approaches as well as its good scalability, e.g., the universal compatibility with existing generative control tools. The project page is at https://aigcdesigngroup.github.io/UniPortrait-Page/.