Analysis · Wearables · May 1, 2026 · 5 min read · By Omer YLD

Smart Glasses Are Watching: What the Meta Ray-Ban Contractor Scandal Means for You

Meta cut more than 1,000 Kenyan contractors after they reported reviewing intimate Ray-Ban Meta footage. A class-action lawsuit followed, and Kenya opened a regulatory probe. Here's what the scandal exposes about every camera-equipped wearable on your face.

A pair of folded matte-black eyeglasses with gold hinges rests on a closed leather notebook beside a small brass coin in warm rim light. Photo: Technerdo

Smart glasses sold a quiet, comfortable lie: that what their cameras capture stays on the device. The Meta Ray-Ban contractor scandal broke in February, escalated to a class-action lawsuit in March, and by late April saw Meta terminate its Sama contract, affecting more than 1,000 workers. It has burned that lie down. If you wear a camera on your face, somebody you've never met is plausibly watching the footage to teach the AI behind it.

The Briefing · Smart glasses privacy · April 2026

What we know

  • 1,000+ workers terminated — Meta ended its contract with Sama in Kenya after the scandal broke in Swedish and Kenyan press in February.
  • Class-action lawsuit filed — plaintiffs Bartone and Canu sued Meta in March 2026 over undisclosed offshore video review.
  • Kenya regulatory probe — the Office of the Data Protection Commissioner opened a high-priority investigation in April.

What actually happened

In February 2026, Swedish publications Svenska Dagbladet and Göteborgs-Posten published a joint investigation that traced Ray-Ban Meta footage from the wearer's face to a labelling pipeline run by Sama, a Kenya-based outsourcing firm. Workers told reporters they had seen videos of people changing clothes, using bathrooms, and in some cases having sex — captured by Ray-Ban Meta users who, the workers believed, did not realise the footage was being uploaded for human review.

The story moved quickly. In early March, plaintiffs Gina Bartone and Mateo Canu filed a class action against Meta Platforms, alleging the company had failed to disclose that captured video was transmitted off-device to a Kenyan subcontractor. Kenya's Office of the Data Protection Commissioner opened a formal investigation in April. Late in the month, Ars Technica reported that Meta had terminated the Sama contract; IBTimes UK pegged the impact at more than 1,000 workers. Meta's framing was that Sama's workers "didn't meet our standards." Sama's response was that Meta never raised a specific performance issue and that the workers had followed the security and operational protocols Meta provided.

Filed under: Smart Glasses · Meta · Ray-Ban · Privacy · Wearables · 2026
About the writer

Omer YLD

Founder & Editor-in-Chief

Omer YLD is the founder and editor-in-chief of Technerdo. A software engineer turned tech journalist, he has spent more than a decade building web platforms and dissecting the gadgets, AI tools, and developer workflows that shape modern work. At Technerdo he leads editorial direction, hands-on product testing, and long-form reviews — with a bias toward clear writing, honest verdicts, and tech that earns its place on your desk.

  • Product Reviews
  • AI Tools & Developer Workflows
  • Laptops & Workstations
  • Smart Home
  • Web Development
  • Consumer Tech Analysis

Why this is structurally different from a webcam scare

Tech privacy stories follow a familiar arc: a company is caught collecting more than it disclosed, users get angry, the company tightens controls, the headline fades. The Ray-Ban Meta version is different in three respects.

Continuous, on-by-default capture. Unlike a phone camera you point and shoot, smart glasses are by design hands-free and unobtrusive. The hardware works best when the wearer forgets it's recording. That same property makes consent, yours and that of the bystanders around you, structurally impossible to verify.

The training-data pipeline is the product. Modern multimodal AI is data-hungry; Meta's pitch for the next generation of glasses depends on continuous learning from real wearer footage. Pulling video to human reviewers isn't a bug, it's the loop. The Sama-to-Meta termination doesn't end the loop; it relocates it.
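The loop described above can be sketched in a few lines. This is purely illustrative: every name here (route_capture, improve_services_opt_in, the sample rate) is hypothetical and none of it reflects Meta's actual systems. It only shows the structural point that a consumer opt-out toggle sits upstream of human review, and that a sampled slice of opted-in captures is what lands in a labelling queue:

```python
# Illustrative sketch with hypothetical names -- not Meta's real pipeline.
# A training-data loop routes a sampled slice of opted-in captures to
# human reviewers; everything hinges on the upstream opt-in flag.
import random

def route_capture(capture, review_queue, sample_rate=0.01):
    """Decide where one capture goes in a hypothetical training-data loop."""
    if not capture["improve_services_opt_in"]:
        return "local-only"  # opted out: footage never leaves the device
    if random.random() < sample_rate:
        review_queue.append(capture["id"])  # small slice goes to human review
        return "queued-for-human-review"
    return "model-training-only"  # the rest feeds automated training

queue = []
print(route_capture({"id": "clip-1", "improve_services_opt_in": False}, queue))
# prints "local-only"; the queue stays empty
```

The design point the article makes falls out of the sketch: terminating one review vendor changes who drains the queue, not whether the queue exists.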

You are not the only person being recorded. Phone-camera privacy law has had decades to calibrate to the reasonable expectations of bystanders. There is no equivalent calibration for AI glasses worn in changing rooms, waiting rooms, or anywhere a person might be undressed near a wearer.

What this means for buyers right now

We've covered Ray-Ban Meta's appeal as a hardware product before — the glasses are genuinely good, and the integration with Meta's models is the best in the category. The privacy posture changes the buying calculus more than the hardware does. Three concrete recommendations:

Heads up

Assume any Ray-Ban Meta footage you have ever uploaded was reviewable by a human reviewer at some point. Meta's contract change does not reach back into already-collected training data. If you've worn the glasses indoors with other people present, that data is in the pipeline.

  1. Disable cloud sync if you keep wearing them. Meta AI's "improve our services" toggle is the consumer-side opt-out. It's buried (Settings → AI → Improve AI features) and defaults to on. Off means your captures stay local; the cost is that most generative features stop working.
  2. Don't wear them in spaces where bystanders haven't consented. This is the part of smart-glasses etiquette nobody publishes. Bathrooms, locker rooms, and intimate moments are out of bounds even with the recording light on, because the recording light is barely visible.
  3. Watch the EU and Kenyan rulings. Both jurisdictions are likely to issue binding guidance before US regulators do. If you're in either, expect mandatory disclosure language and possibly a hardware-side recording indicator change.

For a broader view of how the industry is treating consent and AI training data this year, our state of cybersecurity in 2026 piece holds up — the supply-chain pattern here is the same as everywhere else in 2026's AI stack: data flows further than the consumer sees, and the visibility isn't catching up.

What to watch next

  • The class-action discovery phase. If plaintiffs can pull the full data path — wearer to S3 to Sama labellers — into the public record, every other AI hardware company will get a corresponding subpoena.
  • Meta's replacement contractor. If the work moved to a different country and a different vendor, the structural problem moved with it. Watch for who Meta names.
  • Apple's smart glasses launch. Apple's AirPods Ultra and rumoured glasses with cameras are next year's competitive threat. Apple's privacy framing depends on credible on-device processing — the Ray-Ban scandal is exactly the contrast Apple wants to draw.
  • Hardware-side LED standards. EU regulators have signalled interest in mandating a recording-active LED that can't be defeated. If that lands, Meta has to redesign the frames.

The lesson is structural, not specific to Meta. Any device that captures continuous video and routes it through a model — phones with always-on cameras, doorbells with AI summaries, fridges with interior cameras — will eventually need a human-review pipeline to keep the AI honest. The Ray-Ban Meta scandal is the first time the consumer side of that pipeline became visible. It will not be the last.

If you are going to wear cameras on your face, wear them like cameras.

— ∎ —