Meta’s AI-powered Ray-Ban smart glasses are reportedly collecting sensitive user data, including intimate footage, which is then reviewed by workers in Nairobi, Kenya, according to a Swedish newspaper.
The glasses, which allow users to activate an AI assistant with voice commands, process images, and record short videos, rely on a subcontractor, Sama, to train AI systems.
Data annotators at Sama label images, transcribe audio, and evaluate AI responses to user interactions.
“In some videos, you can see someone going to the toilet or getting undressed. I don’t think they know, because if they knew they wouldn’t be recording,” one worker told Svenska Dagbladet.
“We see everything – from living rooms to naked bodies. Meta has that type of content in its databases. People can record themselves in the wrong way and not even know what they are recording. They are real people like you and me,” another worker said.
Some footage reportedly includes people having sex. Workers also said personal devices are banned in the office to prevent leaks.
Meta’s terms of service state that "in some cases, Meta will review your interactions with AIs, including the content of your conversations with or messages to AIs, and this review can be automated or manual (human)".
Former Meta employees said faces appearing in annotation data are automatically blurred. Kenyan workers, however, claim the anonymisation tools do not always work.
"The algorithms sometimes miss. Especially in difficult lighting conditions, certain faces and bodies become visible," a former Meta worker said.
Data annotators recalled feeling uncomfortable reviewing the footage but said they continued working out of economic necessity.
"You understand that it is someone’s private life you are looking at, but at the same time you are just expected to carry out the work. You are not supposed to question it. If you start asking questions, you are gone," they said.
Aakash Gupta, a technology commentator, explained on X that every interaction with the glasses is routed through Meta’s servers to Sama in Nairobi, where workers manually label objects in the videos.
"Meta markets these glasses as 'designed with your privacy in mind.' The privacy design is a tiny LED light on the frame that most people don't notice.

"The data pipeline behind it routes your bedroom footage to a contractor (Sama) with a documented history of worker exploitation, failed anonymisation, and union-busting lawsuits.

"And the next generation of these glasses? Meta is planning to add facial recognition. The same system that can't reliably blur faces in training data wants to start identifying them on purpose.

"The LED light on the frame is doing about as much for your privacy as the terms of service nobody reads," he wrote.