
MyTimeMachine: Personalized Facial Age Transformation

Luchao Qi, Jiaye Wu, Bang Gong, Annie N. Wang, David W. Jacobs, Roni Sengupta

2024-11-25


Summary

This paper introduces MyTimeMachine, a method that transforms a person's facial images to show how they might look at different ages, using only a small number of their personal photos.

What's the problem?

Creating realistic images that show how a person ages can be difficult because traditional methods often require a lot of data and do not accurately reflect an individual's unique aging process. Existing techniques may produce images that look plausible but do not truly resemble how the person would look at a specific age. This makes it hard to personalize age transformations effectively.

What's the solution?

MyTimeMachine combines a general understanding of how people age (global aging prior) with personal photo collections (as few as 50 images) to create accurate age transformations. It uses an innovative Adapter Network that merges general aging features with personalized ones to generate images that reflect the target age while preserving the person's identity. The system also includes special training techniques to ensure high-quality results and can even be applied to videos for consistent aging effects.
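The core idea of the Adapter Network is to merge the general (global) aging features with features learned from the user's own photos before decoding the result. The paper does not specify the exact merging operation at this level of detail, so the sketch below is only a hypothetical illustration: it blends a global aging latent with a personalized latent via a weighting factor `alpha`, which is an assumption for illustration, not the paper's actual architecture.

```python
def blend_aging_features(global_feat, personal_feat, alpha=0.5):
    """Hypothetical blend of a global aging latent with a
    personalized latent, as a stand-in for the paper's Adapter
    Network. `alpha` controls how strongly the personalized
    features pull the result away from the global prior."""
    if len(global_feat) != len(personal_feat):
        raise ValueError("feature vectors must have the same length")
    # Move each global coordinate toward its personalized counterpart.
    return [g + alpha * (p - g) for g, p in zip(global_feat, personal_feat)]


# Example: with alpha=0.5 the result lies halfway between the two latents.
blended = blend_aging_features([0.0, 2.0], [1.0, 0.0], alpha=0.5)
# → [0.5, 1.0]
```

In the actual system, the merged features condition a StyleGAN2 generator to produce the re-aged image; this sketch only captures the blending step.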

Why it matters?

This research is significant because it offers a practical and efficient way to visualize aging for individuals, which can be useful in various fields such as entertainment (like movies and video games), personal use (like seeing how loved ones might age), and even in medical applications. By making it easier to create realistic aging transformations, MyTimeMachine enhances our ability to understand and visualize the aging process.

Abstract

Facial aging is a complex process, highly dependent on multiple factors like gender, ethnicity, lifestyle, etc., making it extremely challenging to learn a global aging prior to predict aging for any individual accurately. Existing techniques often produce realistic and plausible aging results, but the re-aged images often do not resemble the person's appearance at the target age and thus need personalization. In many practical applications of virtual aging, e.g. VFX in movies and TV shows, access to a personal photo collection of the user depicting aging in a small time interval (20–40 years) is often available. However, naive attempts to personalize global aging techniques on personal photo collections often fail. Thus, we propose MyTimeMachine (MyTM), which combines a global aging prior with a personal photo collection (using as few as 50 images) to learn a personalized age transformation. We introduce a novel Adapter Network that combines personalized aging features with global aging features and generates a re-aged image with StyleGAN2. We also introduce three loss functions to personalize the Adapter Network with personalized aging loss, extrapolation regularization, and adaptive w-norm regularization. Our approach can also be extended to videos, achieving high-quality, identity-preserving, and temporally consistent aging effects that resemble actual appearances at target ages, demonstrating its superiority over state-of-the-art approaches.
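The abstract names three loss terms used to personalize the Adapter Network. The paper presumably combines them into a single training objective; the sketch below shows the common pattern of a weighted sum, where the weight names `lam_extrap` and `lam_wnorm` are assumptions for illustration and not taken from the paper.

```python
def personalization_loss(aging_loss, extrap_reg, wnorm_reg,
                         lam_extrap=0.1, lam_wnorm=0.01):
    """Hypothetical combined objective: the personalized aging loss
    plus weighted extrapolation and adaptive w-norm regularizers.
    The weights are placeholder values, not the paper's settings."""
    return aging_loss + lam_extrap * extrap_reg + lam_wnorm * wnorm_reg


# Example: each regularizer contributes according to its weight.
total = personalization_loss(1.0, 1.0, 1.0)
# → 1.11
```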