Harvard students expose how Meta glasses can be transformed into an AI-powered surveillance tool

Two Harvard students turned Meta’s smart glasses into a powerful artificial intelligence (AI)-powered surveillance tool to demonstrate how the line between innovation and privacy invasion has become increasingly blurred. Their I-XRAY project exposes the unsettling potential for misuse of everyday technology. They also suggested ways to guard against such misuse.

Two Harvard students have demonstrated how Meta’s smart glasses, combined with facial recognition technology, can instantly reveal personal information. Using widely available technology like Ray-Ban Meta smart glasses and public databases, AnhPhu Nguyen and Caine Ardayfio have developed a system that can dox individuals, revealing their names, phone numbers, and addresses in real time.

The demo, dubbed I-XRAY, works by live streaming video from the smart glasses to a computer program. This program uses AI to identify faces in the footage and then searches public databases for corresponding information. The results are then displayed on a connected phone app.
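The students have not published their code, but the pipeline described above can be pictured as a simple loop: read frames from the livestream, detect faces, query a lookup service, and push the results to a phone. The sketch below is purely illustrative; the stream URL, lookup endpoint, and phone-app endpoint are hypothetical placeholders, and the off-the-shelf OpenCV face detector stands in for whatever model the students actually used.

```python
# Illustrative sketch of an I-XRAY-style pipeline, not the students' code.
# STREAM_URL, LOOKUP_API and PHONE_APP_API are hypothetical placeholders.
import cv2
import requests

STREAM_URL = "rtmp://example.com/glasses-live"        # hypothetical livestream source
LOOKUP_API = "https://example.com/reverse-face-api"   # hypothetical face-search endpoint
PHONE_APP_API = "https://example.com/phone-app/push"  # hypothetical results endpoint

# Off-the-shelf face detector bundled with OpenCV (Haar cascade).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def lookup_identity(face_jpeg: bytes) -> dict:
    """Send a cropped face to a (hypothetical) reverse-face-search service
    and return whatever public-record fields it claims to match."""
    resp = requests.post(LOOKUP_API, files={"image": face_jpeg}, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"name": ..., "address": ..., "phone": ...}


def main() -> None:
    cap = cv2.VideoCapture(STREAM_URL)  # read frames from the livestream
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
            face = frame[y:y + h, x:x + w]
            ok, jpeg = cv2.imencode(".jpg", face)
            if not ok:
                continue
            info = lookup_identity(jpeg.tobytes())
            # Forward whatever was found to the companion phone app.
            requests.post(PHONE_APP_API, json=info, timeout=10)
    cap.release()


if __name__ == "__main__":
    main()
```

The point of the sketch is how little custom machinery is needed: every stage maps onto commodity components, which is exactly the concern the students set out to demonstrate.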

In a disturbing demonstration, the students identified classmates and uncovered their addresses and even the names of relatives. More alarmingly, they approached strangers on public transportation and pretended to know them based on the information gathered. This highlights the potential for misuse and the serious privacy implications of such technology.

The implications of this technology have sparked significant concerns about privacy and doxing. Doxing is the malicious practice of searching for and publicly disclosing private personal information about an individual or organisation.

In their documentation, the two students acknowledged that their initial ‘side project’ quickly evolved into a tool with serious privacy implications. They emphasised that their intention was not to create a harmful tool but to demonstrate current technology’s potential dangers.

By showcasing how easily personal details can be extracted from a person’s face in public, they aimed to raise awareness about the urgent need for stronger privacy protections.

The two students also provided resources to help users manage their online presence and reduce the risk of identity theft. One key step is removing your face from reverse face search engines. Platforms like Pimeyes and Facecheck.id allow users to upload a photo and find similar faces across the web. Fortunately, both offer free opt-out services to remove your images from their searches, though Facecheck.id may require verification through a government ID.
