EgoPrivacy: What Your First-Person Camera Says About You?
Yijiang Li, Genpei Zhang, Jiacheng Cheng, Yi Li, Xiaojun Shan, Dashan Gao, Jiancheng Lyu, Yuan Li, Ning Bi, Nuno Vasconcelos
2025-06-17
Summary
This paper introduces EgoPrivacy, a study and benchmark that investigates the privacy risks of first-person (egocentric) videos captured by wearable cameras. It shows that even though these videos rarely show the wearer's face, AI models can still infer a great deal of private information about the camera wearer from the video content alone, such as their identity, demographics, or location.
What's the problem?
Wearable cameras continuously capture personal views of the wearer's life. Because the wearer's face is usually not visible, the footage can feel private, but this is a false sense of security: the video still contains a great deal of hidden private information. AI models can extract details such as the wearer's gender, race, exact identity, or location in a zero-shot setting, without any extra training, which raises serious privacy concerns.
What's the solution?
The solution was to create EgoPrivacy, a large-scale benchmark that measures how much private information AI models can extract from egocentric videos. The researchers evaluated several types of attacks, including a Retrieval-Augmented Attack that combines first-person videos with third-person footage from other sources to identify wearers more accurately. Their experiments showed that even AI models used zero-shot, without any extra training, can infer private details with high accuracy, underscoring the need for better privacy protections.
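To make the retrieval idea concrete, here is a minimal sketch of how an identification attack based on embedding retrieval might work. This is an illustrative assumption, not the paper's actual pipeline: the real attack uses learned video encoders, while this toy uses random vectors standing in for precomputed embeddings, and the names (`gallery`, `identities`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(x):
    """L2-normalize rows so dot products equal cosine similarity."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Gallery: one third-person (exocentric) embedding per known identity.
# In a real attack these would come from a video encoder; here we use
# random unit vectors purely for illustration.
identities = ["person_a", "person_b", "person_c"]
gallery = normalize(rng.normal(size=(3, 128)))

# Ego query: simulate an egocentric clip of person_b by perturbing
# that identity's gallery embedding with small noise.
query = normalize(gallery[1] + 0.05 * rng.normal(size=128))

# Retrieval step: score every gallery identity by cosine similarity
# and predict the highest-scoring one as the camera wearer.
scores = gallery @ query
predicted = identities[int(np.argmax(scores))]
print(predicted)
```

The key design point this illustrates is that the attacker never needs the wearer's face in the ego video: it suffices that ego and exo footage map to nearby points in a shared embedding space, so a nearest-neighbor lookup over a gallery reveals the identity.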
Why it matters?
This matters because as more people use wearable cameras, it’s important to understand the risks to their privacy. Knowing that AI can reveal personal information from these videos helps push for stronger privacy measures to protect users. It also raises awareness about how seemingly private data can be exposed, which is crucial for developing safer technologies and trustworthy wearable devices.
Abstract
EgoPrivacy evaluates privacy risks in egocentric vision through a large-scale benchmark, revealing that foundation models can infer private information about camera wearers with high accuracy in zero-shot settings.