Two Harvard Students Show How Ray-Ban Meta Smart Glasses Could Be Used to Instantly Dox People

As a seasoned privacy advocate who has witnessed the digital landscape evolve over the past few decades, I find myself increasingly alarmed by advancements such as the “I-XRAY” project demonstrated by AnhPhu Nguyen and Caine Ardayfio. The ability to expose a person’s personal data with a mere glance through smart glasses integrated with facial recognition software is not only unsettling but also a stark reminder of the challenges that lie ahead in preserving our privacy in an increasingly connected world.

A report by Ashley Belanger for Ars Technica has disclosed that two Harvard students have demonstrated how Meta’s smart glasses can be paired with facial recognition software to reveal a person’s identity and private information in a matter of seconds.

AnhPhu Nguyen and Caine Ardayfio adapted a pair of Ray-Ban Meta Smart Glasses to work with PimEyes, a reverse-image facial recognition engine, and a large language model (LLM). Their system, known as “I-XRAY,” has the chilling capability to scrape personal data such as names, phone numbers, and addresses from the web almost instantly, merely by observing individuals in public spaces through the glasses. Nguyen’s explanation of the technology underscores grave privacy concerns, given its ability to identify strangers by their physical appearance alone.

Are we prepared for a future where information about us is easily accessible? Here’s a suggestion on how to safeguard your data: @CaineArdayfio and I provide guidance on protecting yourself.

— AnhPhu Nguyen (@AnhPhuNguyen1) September 30, 2024

The students tested the system at a subway station, swiftly capturing commuters’ faces and retrieving publicly accessible information through online people-search platforms. Because the details came up so quickly, some passersby were deceived into thinking the students actually knew them. The pair described the project as a demonstration of how easily such technology could be misused for harmful ends. “A man could effortlessly discover a woman’s home address on the train and track her down,” Nguyen cautioned.

I-XRAY leverages recent breakthroughs in large language models and facial recognition technology, automating data extraction that would once have demanded considerable time and labor. The students selected Meta’s Ray-Ban glasses with transparent lenses because their unobtrusive design makes them look like ordinary eyewear, and they disabled the glasses’ recording light so the scanning went undetected, emphasizing how easily the hardware can be abused.
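The automation the students describe amounts to chaining a few stages: grab a frame from the glasses, run a reverse face search, have an LLM extract a likely name from the matched pages, then query a people-search site. The sketch below is purely illustrative: every stage is a hypothetical stub returning mock data (none of the named services exposes a public API used here), shown only to convey how little glue code such a chain requires — not the students’ actual implementation, which they have declined to release.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    address: str
    phone: str

# --- Hypothetical stage stubs. Real services offer no such APIs; mock data only. ---

def capture_frame(stream):
    """Stand-in for grabbing a still image from the glasses' video stream."""
    return stream  # pretend the stream yields a cropped face image

def reverse_face_search(face_image):
    """Stand-in for a reverse-image face search; returns pages where the face appears."""
    return ["https://example.com/alumni/jane-doe"]  # mocked match

def extract_name(urls):
    """Stand-in for an LLM pass that pulls a likely name out of the matched pages."""
    return "Jane Doe"  # mocked result

def people_search(name):
    """Stand-in for querying a people-search site by name."""
    return Profile(name=name, address="123 Example St", phone="555-0100")

def ixray_pipeline(stream) -> Profile:
    """Chains the stages in the order the article describes."""
    face = capture_frame(stream)
    urls = reverse_face_search(face)
    name = extract_name(urls)
    return people_search(name)

result = ixray_pipeline("mock-frame")
print(result.name, result.address)
```

The point of the sketch is structural: once each stage exists as a service, the doxing "application" is only a handful of lines, which is precisely why the students consider the combination so dangerous.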

Although Nguyen and Ardayfio built a working system, they made clear that they have no plans to release its source code. Their objective was to highlight escalating privacy risks, and they urged people to remove their data from invasive search engines such as PimEyes in order to safeguard their personal information.

In the European Union, privacy regulations require consent before facial recognition data can be collected; no such safeguard exists in the U.S., leaving Americans more exposed to misuse by unscrupulous actors. The students’ project is not unique, either; similar technologies are being developed worldwide. Clearview AI, for example, a firm that provides facial recognition to law enforcement, has reportedly considered building smart glasses capable of scanning faces, a prospect that has drawn considerable concern given Clearview’s contentious methods and its ambition to include almost every human face in its database.

Nguyen and Ardayfio have also published guidance on removing personal information from facial recognition engines such as PimEyes and Facecheck ID, and from people-search databases including FastPeopleSearch, CheckThem, and Instant Checkmate. Even so, they found that opting out does not guarantee anonymity: some individuals remained easily identifiable. The unsettling implication is that technologies like I-XRAY could soon be within reach of anyone with the right resources, making this a pressing privacy concern for the near future.

2024-10-03 12:17