I do think it's completely unacceptable if Meta makes the glasses unable to be used for routine functions without (a) other humans reviewing your private content and (b) AI training on your content. There needs to be total transparency to people when this is happening - these are absolutes.
But I'm a bit confused by the article because it describes things that seem really unlikely given how the glasses work. They shine a bright light whenever recording. Are people really going into bathrooms, having sex, sharing rooms with people undressed while this light is on? Or is this deliberate tampering, malfunctioning, or Meta capturing footage without activating the light (hard to believe even Meta would do this intentionally).
chwahoo
I'll confess that I like my Meta Ray Ban glasses: I love using them to listen to podcasts at the pool/beach, while riding my bike, and it's cool to snap a quick picture of my kids without pulling out my phone.
I wish this article (or Meta) were a bit clearer about the specific connection between the device settings and use and when humans get access to the images.
My settings are:
- [OFF] "Share additional data" - Share data about your Meta devices to help improve Meta products.
- [OFF] "Cloud media" - Allow your photos and videos to be sent to Meta's cloud for processing and temporary storage.
I'm not sure whether my settings would prevent my media from being used as described in the article.
Also, it's not clear which data is being used for training:
- random photos / videos taken
- only use of "Meta AI" (e.g., "Hey Meta, can you translate this sign")
As much as I've liked my Meta Ray-Bans, I'm going to need clarity here before I continue using them.
TBH, if it were only use of Meta AI, I'd "get it" but probably turn that feature off (I barely use it as-is).
jspdown
Don't you need to obtain consent before filming random people in the street? I already feel uncomfortable when someone takes a photo in public and I happen to be in it, but this type of device takes things to an entirely different level. With smart glasses, there's no visible cue that you're being recorded. No phone held up, no camera in sight. I'm questioning the legality of this in Europe, where privacy laws tend to be stricter. In the meantime, should I just assume that anyone wearing these glasses is always filming? And would I be within my rights to ask them to stop the moment I notice them?
notyetmachine
Ghanaian authorities are seeking the arrest of a Russian national who was using the glasses to record himself picking up, and sleeping with, women in Ghana and Kenya. He uploaded the videos to social media and Telegram (https://www.bbc.com/news/articles/c9wn5p299eko). It was quite the story on African tech Twitter last month.
Everyone should assume that _anything_ connected to internet will get uploaded to internet and someone within the company will have permission to review the contents regardless of what the policy says.
1. Debugging for troubleshooting.
2. Analytics for making the product better.
3. Bugs that collect your info when they shouldn't.
4. Bugs from third-party vendors, if the company uses them.
5. Insecure processes: getting access to private content within the company is trivial due to coarse permission models.
Source: I worked at two well-known social media companies, on Trust & Safety and data infra teams.
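The coarse-permission point above can be illustrated with a toy sketch (hypothetical code with invented role names, not any real company's access-control system): a single role check gates every user's content, whereas a scoped check ties access to the specific case an employee is working.

```python
# Hypothetical sketch of coarse vs. scoped internal access control.
# Roles and function names are invented for illustration only.

def can_read_coarse(employee_role: str) -> bool:
    # Coarse model: one role bit grants access to ALL user content.
    return employee_role in {"trust_and_safety", "data_infra"}

def can_read_scoped(employee_role: str, case_user_id: str,
                    content_user_id: str) -> bool:
    # Scoped model: access is tied to the user in the case being worked,
    # so an employee can't browse arbitrary accounts.
    return (employee_role == "trust_and_safety"
            and case_user_id == content_user_id)
```

Under the coarse model, anyone holding the role can browse anything; the scoped version at least narrows access to one user per case and makes a per-case audit trail possible.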
blakesterz
Meta aims to introduce facial recognition to its smart glasses while its biggest critics are distracted, according to a report from The New York Times. In an internal document reviewed by The Times, Meta says it will launch the feature “during a dynamic political environment where many civil society groups that we would expect to attack us would have their resources focused on other concerns.”
I was in engineering school back in ~2012 when Google Glass came out. One of my classmates got hold of a pair when they were still quite uncommon and wore them to an extracurricular club meeting. Within minutes someone made a comment about him wearing the "creeper" glasses and asked if he was filming. He never wore them to the club again.
I just don't see a world where that doesn't happen with Meta glasses.
mayowaxcvi
My concern was whether the glasses might record or transmit data while switched off or in standby mode.
From what I can tell, they don’t do this intentionally. So the risk is broadly similar to other modern electronic devices.
The creepiness concern is real, but I think people misplace where the actual surveillance happens. The most consequential stores of personal data aren't ad networks; they're things like banks, hospitals, insurers, and telecoms. These institutions hold information about your health, finances, movements, and relationships, indexed and searchable by employees you've never met, governed by policies you've never read.
Realistically, there’s very little an individual can do to completely opt out.
My take is: if the main outcomes are that I get shown ads for things I don’t need and my facecomputer knows the difference between a fork and a spoon… I… I can live with that.
bhekanik
As a dev, I think the core issue isn’t whether one indicator LED can be bypassed — motivated people can bypass almost any client-side control. The trust boundary is policy + defaults. If enabling “AI features” implicitly authorizes broad retention/review, users won’t understand the tradeoff until after the fact.
A better pattern would be tiered modes with explicit UX: local-only capture, cloud processing without retention, and opt-in retention/training with visible status. If the product can’t technically support that separation today, that limitation should be stated plainly in setup, not buried in policy docs.
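The tiered modes proposed above could be sketched roughly like this (a minimal illustration with invented names, not Meta's actual software or settings):

```python
from enum import Enum, auto

class CaptureMode(Enum):
    # Hypothetical tiers, from most to least private.
    LOCAL_ONLY = auto()       # media never leaves the device
    CLOUD_EPHEMERAL = auto()  # cloud processing, no retention or training
    CLOUD_RETAINED = auto()   # explicit opt-in: retention, review, training

def allowed_uses(mode: CaptureMode) -> set[str]:
    """What the service may do with a capture in each tier."""
    uses = {"on_device_processing"}
    if mode in (CaptureMode.CLOUD_EPHEMERAL, CaptureMode.CLOUD_RETAINED):
        uses.add("cloud_inference")
    if mode is CaptureMode.CLOUD_RETAINED:
        uses.update({"retention", "human_review", "training"})
    return uses
```

The point of the separation is that "human_review" and "training" only ever appear in the tier the user explicitly opted into, and the currently active tier is something the device UX could surface at all times.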
_ZeD_
Sooo... I really should just keep this[1] running all the time...
I do not care about the privacy of people who buy these glasses nor their families.
I care about the innocent people whose privacy is invaded by people who buy these glasses.
bogzz
I am so far removed from the type of person who might consider buying something like that. You'd have to be exceptionally impervious to social cues to even think of wearing that in public.
If you're blind, it's of course understandable but that's pretty much it in terms of cases in which I would consider the glasses acceptable to wear in public.
NalNezumi
I sincerely hope someone in Japan or Korea gets caught using these to peek under trousers on the train, so they get the forced camera-sound treatment that smartphones got over there.
So the world can label them Hentai glasses and move on.
ccccrrriis
I got a pair as a gift and didn't look much into them but I have to be honest, I assumed any data I captured - voice, video, etc. - would be sent to their servers (to use their models) and they'd be using it for training with humans in the loop.
Tbh the only thing I really use the glasses for are listening to music or talking on the phone - so basically how you'd use airpods. I don't use airpods because I had an ear injury that prevents me from using them on my left ear, so these glasses were kinda nice for that. I really wish they didn't have a camera though because I do always feel compelled to remove them if I interact with people.
I also have to add that the quality is mediocre. They're a month old and the case has problems charging sometimes, and one of the screws is always coming loose at a hinge no matter how often I retighten that side.
majestik
Is anyone here actually surprised Meta is recording and reviewing their content?
Vote with your dollars people.
gverrilla
Usage of creep-ware won't be tolerated in the social groups I take part in.
We will shame hard anyone who uses this sh1t.
dhab
Conditioning the crowd gradually towards being monetised in some of the most egregious ways - first pay for the glasses, then pay with revelations of private life sold to government (ICE?), business (private insurance) and so on. Super evil.
And despite this, there is no strong will to detach from what they produce - neither at the beginning, nor later when it is considered part of the cultural fabric. That's how good their tactics are.
And for the pay one gets working for them: screw the world! I won't use it anywhere near my loved ones - but I will build it.
andy_ppp
We need safe spaces where you aren't constantly living inside the panopticon...
shevy-java
Well - don't wear their spyglasses. It's really not that hard. You can still record stuff without spyglasses. People do that on YouTube too, e.g. first amendment audits. It's not that different from the spyglasses, except that you can cut Meta off from the process. (Admittedly YouTube creates another problem, which is called Google; it would be nice if we could have platforms without a corporate overlord, but the financial aspect may still be an issue that requires solving. I don't have a good way to solve that, as I also have a 100% zero-ads policy, i.e. using uBlock Origin mandatorily. And Google declared total war against uBlock Origin, we all know that.)
binarynate
At a friend's party recently, I met someone who told me that they had worked in data for Meta's glasses division and warned me never to get Meta glasses for this very reason—that the workers can see everything. They told me of a comical case where a guy pulled down his pants to look at his penis, asked "Meta, what is this?", and the AI responded that it was a thumb. XD
greatgib
Privacy policies and usage terms are like the magic wand of the industry. Whatever totally bad things they want to do, and however they want to abuse you and your data, they just have to add a few unreadable lines to a 40-page document and that's it.
No one will read it, but even if you do, most of the time the FOMO or sunk cost fallacy effect will make you go on anyway.
And then it is a free pass for them.
halapro
To a technical person, this is obvious. AI doesn't happen on the glasses, it doesn't happen on your crappy phone, it happens online. Live streaming, which is also a feature, by definition sends everything it captures to someone else's computer (ahem, the cloud).
Yesterday I saw an Instagram reel of a guy asking "what am I looking at" while between his girlfriend's legs. Congrats, some Indian guy saw her too.
The core piece of information that is missing or unclear is whether this collection happens also when not actively and knowingly sending data to the cloud.
The glasses let me record videos locally; can Facebook see any frames of them? This is the question that needs to be answered. Everything else is nonsense like "omg Amazon hears what I tell Alexa".
robotburrito
Often I hear, "These are so cool! It's a shame Meta makes them." All of their pivots will fail because of their track record, IMO.
KaiserPro
What I don't understand is where this data is coming from. Is it actually Meta's Ray-Bans, or is it Project Aria (https://www.projectaria.com/)?
Because I didn't think the data was uploaded to Meta by default when you take a video with the Ray-Bans.
Moreover, I didn't think those glasses could record more than 2.5 minutes.
The point still remains: the devil is in the details of the "privacy" policy.
nothrowaways
The whole project is a creepy privacy nightmare.
zouhair
I am confused that people act so offended. What do people expect? Meta and all tech companies do not believe in privacy for anyone but themselves. Isn't our info how they make their money?
Mulhimfy
The privacy implications here go beyond just the recording light. The real concern is how Meta trains AI models on captured data without explicit consent, and most users have no idea this is happening.
xmx98
Of course! Glasses with cameras are a classic secret spy gadget :)
arian_
"Workers can see everything" means this isn't an AI privacy problem. It's a surveillance-as-a-service problem with extra steps.
impossiblefork
While it may be legal for an individual to film something, it is certainly not permissible to process video data of this sort at scale.
I don't agree that responsibility to comply with Swedish law is on the wearer. This should motivate prosecutors to immediately order raids to secure any data relating to the processing of the data.
I think the Swedish camera surveillance law is also applicable, and there's a deceptive element, since the cameras are disguised as glasses.
showerst
How does this not fall afoul of states with two-party consent laws around recording conversations? Particularly since California is one of the strictest states.
de6u99er
It's the same issue with Tesla collecting camera feeds through their cars to use for machine learning.
Those videos can also be used to track people. IMHO, each Tesla owner sending video data to Tesla's data centers is violating privacy laws!
FireSquid2006
I'm not sure if there is any use case that could convince me to mount an internet connected device to my head at all times.
bryanrasmussen
Smart glasses are a potential great boon for mankind, really. It's just that both of the iterations we've had have come from companies that are arguably detrimental to humanity.
smbullet
Hopefully this causes Meta to be more transparent about what data is sent to their annotators. It seems like even the annotators didn't know whether the person explicitly hit record (accidentally or not) or whether it's sampled from a constant stream. This kind of makes it impossible for anyone to consent to the purchase agreements.
roughly
Everything else in this article is horrific, but this stuck out to me:
> “The algorithms sometimes miss. Especially in difficult lighting conditions, certain faces and bodies become visible”.
Right, “difficult lighting conditions,” not sure when we’d run into those in situations where we might be concerned with privacy. A 97% success rate looks good on paper.
nomilk
Is it paranoid to assume every device with a camera/mic can see/hear everything?
That's my default assumption.
umpalumpaaa
The title is now “She Came Out of the Bathroom Naked, Employee Says”
yalogin
Of course they can; why would one expect anything else? However, if you look through their processes, I am sure they are covered by some legal jargon to do the bare minimum in terms of security. They will have every knob available to debug to the lowest level possible and view everything.
dlev_pika
Crazy to have $1 trillion invested in data centers, underpinned by dollar-a-day human Turk ops.
aucisson_masque
Besides the privacy part, I fail to see what value these glasses bring that a smartphone with a camera can't already provide.
And you're still forced to carry a smartphone anyway with these glasses, since they require an internet connection.
Is this fashion, or something I'm not aware of ? They look horrendous to me.
stavros
What the hell? I thought the videos went to the phone directly, they're all getting uploaded to Meta? I don't know why I let my guard down against that company for one second.
EDIT: Wait, is this when you use the "ask Meta" feature? I do expect that to send all the clips to a server for an LLM to process, it's not done on-device. It's not clear to me whether it's that or just all videos/photos you record with the glasses.
breve
Meta's business model is premised on intensive and pervasive user surveillance.
When you use Meta's products and services you are tagged, tracked, and commodified like an animal. You are cattle.
The question isn't whether or not Meta's AI smart glasses raise data privacy concerns.
The question is why use anything from Meta in the first place?
sidcool
Despite the historical misadventures of Meta, if people still use their products with an expectation of privacy, it's on the people.
Murfalo
Surely this is already happening with our other devices? Not that it isn't a problem but that the game is already lost...?
mjbonanno
The privacy angle here is fascinating. Curious if anyone has tried running the on-device model locally yet?
stevefan1999
I would really love to use smart glasses for DevOps, especially Grafana dashboards
arcadianalpaca
The recording light argument keeps coming up, but I don't buy it. I can't tell if someone's glasses have a tiny LED on from across a room, and neither can anyone else. Under GDPR it's on Meta to handle consent, not on me to squint at someone's face to figure out if I'm being filmed.
thunderfork
Fun fact: all advertiser chat support agents at Meta used to (still might) have full super-read on FB. When you read "workers" in this headline, don't think "devs", think "legions of contracted-out T1 support staff"
DavidPiper
Ah yes, while everyone was focused on Flock cameras...
For many more reasons than pervert behaviour, I agree that this kind of tool cannot coexist with healthy society. "Glassholes" was a delightful portmanteau, but I suspect normalising a term like "pedo glasses" will probably put people off them way sooner and faster. At the very least it identifies the product and not the person as the problem.
bys_exe
Meta glasses will scare people in public because they will think they are being recorded even when they are not.
diacritical
I'm against surveillance in general, and I see many people being against these glasses, yet not caring at all about surveillance cameras. Flock in the USA is a bit of an outlier in that it got some people riled up, but where I live in Europe there are private cameras looking out of at least half of the buildings, maybe more. So if you're walking down the street for 15 minutes, you'd be caught by tens or hundreds of cameras from various manufacturers, installed by various businesses and homes. Who knows how many have microphones, which server they store their feed on, what security each cam has, and so on.
I asked 2 cops in a patrol car if I could install cameras on my own and how I should go about it. They said they don't mind them. Officially it's illegal unless you have a permit, but it's so widespread and the law is so unenforced that it's practically 99.99% legal.
I can point a few cameras to the street and record everything 24/7. When I'm on a bus I'm being recorded by a few cameras. On most bus/tram/subway stops there are cameras. In stores and public buildings there are cameras. Most cars have cameras for insurance or general safety concerns. Self-driving cars would have to have cameras, as well as delivery robots.
If we accept this shitty reality, why shouldn't I wear a camera and a mic, too?
lenerdenator
> Meta
> Privacy
Pick one.
TowerTall
There must be a special place reserved for Mark Zuckerberg in hell
pbmonster
Interesting article, but I wonder why the journalists didn't go all the way. Sure, Meta isn't going to comment when you ask them what data they have. But this is in the EU; just hit them with a Subject Access Request under the GDPR.
Would be really interesting to create a completely new account, use the glasses with all upload settings off for a month, then file a SAR and see what they have...
medi8r
This simply needs to be criminalized.
Basically, it is a peeping tom.
ripped_britches
Too funny that the subcontractor working for meta is “sama”
nkrisc
Not all technology is good.
guywithahat
To their credit, all interesting consumer tech introduces privacy concerns because to do interesting things you usually have to integrate tightly with someone's life. I'm just not that concerned about filming sunglasses, in fact I'm kind of excited
instahotstar
These are good glasses.
giwook
What kinds of defensive measures can you even take against such a blatant and yet inevitable invasion of privacy that don't involve you just completely covering your face whenever you go out in public?
Schlagbohrer
I look forward to the day when I can have a fully FOSS, trustworthy pair of smart glasses, made by people who genuinely want to and do put user privacy first. But until then, no fucking way. I don't even like keeping my cellphone in the same room as me when I'm at home.
unselect5917
"People just submitted it. I don't know why. They 'trust me'. Dumb fucks."
-Mark Zuckerberg, 2004
yogorenapan
The annoying thing is that even if you yourself don't use these glasses, as long as people around you do, you are still affected by it. We really need laws to limit always-on recording devices in places where we have an expectation of privacy.
cl0ckt0wer
Oh look, a Flock competitor.
zombot
Who would even expect any privacy from the Facebook mafia? The rename to "Meta" doesn't obscure the fact that they are bottomfeeders.
oldfuture
this should be known by everyone
rodwyersoftware
If you're in public you have no privacy by default.
msy
You would have to have been hiding under an extremely large rock not to assume this given the technology involved and Meta's overtly and consistently anti-privacy stances and history.
iJohnDoe
FTA > "I saw a video where a man puts the glasses on the bedside table and leaves the room. Shortly afterwards his wife comes in and changes her clothes." "The workers describe videos where people’s bank cards are visible by mistake."
This is hugely concerning. We need more details. Why are the glasses recording when not being worn? Is the light on when it's recording?
Are the Meta employees able to turn on the streaming without people knowing? Are these videos only when someone says "Hey Meta..."? Are the Meta employees looking at every "Hey Meta..." video where someone asks AI a question?
These glasses are considered a luxury item and are worn by executives in office environments. They are worn by people in family situations. Someone could be in a confidential or private moment and randomly ask AI a question, one of the primary purposes of the glasses. Are all of these being seen by Meta employees?
jcgrillo
It's genuinely uncanny to see good tech journalism... it's normally so much worse than this.
kgwxd
Why in the world did they even try this again? What market is there for it beyond creeps? Or is that the hot thing right now?
GuinansEyebrows
“I saw a video where a man puts the glasses on the bedside table and leaves the room.”
“Shortly afterwards his wife comes in and changes her clothes”, one of them says.
Based on this and other context in the article, it seems like there's a very realistic chance that Meta is in possession of and actively distributing (internally and to contractors) video content of minors. I wonder if any contractors have confirmed this or have been unwillingly (or worse) exposed to it.
jotux
Meta needs to make a find-your-lost-dog commercial for their smart glasses ASAP.
kgarten
I think this coverage feels very similar to the way Google Glass was treated back in the early 2010s ... there’s a grain of legitimate concern, but the article oversells what these glasses actually do and stokes alarm in a way that goes beyond the available facts.
Workers annotating data for AI might see sensitive content captured by smart glasses. But the leap from that to “we see everything” and framing it like some dystopian panopticon mirrors the early Google Glass panic, where the concerns often outran what the device actually could do.
Legitimate concerns shouldn’t be dismissed, but neither should they be inflated to create a new “Glass-forked-into-Big-Brother” narrative unless the evidence genuinely supports that level of risk ...
wahnfrieden
Post title has been repeatedly edited to make it vague and to remove all mention of the concern.
Actual title is “She Came Out of the Bathroom Naked, [Meta] Employee Says” and subtitle begins with “Bank details, sex and naked people who seem unaware they are being recorded”
Suspicious moderation behaviors on this one
Juliate
Likewise, are there any startups making wearable devices that visually jam or impair digital cameras?
rsynnott
Can we just get Robert Scoble to come back and destroy these, please?
camillomiller
I already personally refuse to be around anyone who wears them. And I think establishments should just outright ban them.
some_furry
Good reporting, but this has always been Meta's M.O. so I'm really not surprised.
The sooner we collectively stop trusting them (and maybe even actively campaign to have the U.S. government meaningfully regulate them), the better.
Personally, I would like to see the company stop existing and its executive board destitute.
guelo
Those glasses have a tiny white LED when the camera is on. It really needs to be more obvious. This might be something we'll need legislation for, since Meta is an evil-ish, immoral company.
ncr100
Just think of the children. Changing a soiled garment, transmitting video of the whole ordeal, isn't that super illegal?
pier25
I really hope these flop and don’t become mainstream.
It would be a surveillance and privacy dystopian nightmare.
lvl155
Only Meta and Zuck would continually introduce invasive products.
3efr4444444
Retest
nosequel
I won't even walk into a house with Alexa devices around, there is no way I'm going to let Meta glasses be in the same room as me.
cubefox
The article is somewhat disingenuous because it "forgets" to mention the bright LED on the glasses while filming. This makes statements in the article that people don't know about video recording much less believable.
tim-tday
Color me shocked.
yieldcrv
"my spying glasses are spying on me"
31337Logic
Holy shit! This is absolutely despicable and probably the worst tech news I've read all year. Why do people still support Meta/Facebook?!?!
maximinus_thrax
I love the Facebook glasses; they seem to be the swan song of a shitty company. Young people abandoned Facebook when their parents started hanging out there; now it's all boomers and bots posting conspiracy theories.
If they think this surveillance tech is going to push the company forward, it means leadership is even more disconnected from reality than the Amazon people who greenlit the superbowl ad. It means the company is dying. Huzzah!
jbxntuehineoh
On an unrelated note, the FT reported today [1] that Israel was able to track Iranian leadership by hacking "nearly all" of the traffic cameras in Tehran. Anyways, I think we should continue to put as many networked cameras, microphones, and other sensors in as many products as possible. There are no downsides!
Why was the title changed from "The workers behind Meta’s smart glasses can see everything" to "A hidden workforce behind Meta’s new smart glasses"? It doesn't go against any guidelines:
> Please don't do things to make titles stand out, like using uppercase or exclamation points, or saying how great an article is. It's implicit in submitting something that you think it's important.
> If the title includes the name of the site, please take it out, because the site name will be displayed after the link.
> If the title contains a gratuitous number or number + adjective, we'd appreciate it if you'd crop it. E.g. translate "10 Ways To Do X" to "How To Do X," and "14 Amazing Ys" to "Ys." Exception: when the number is meaningful, e.g. "The 5 Platonic Solids."
> Otherwise please use the original title, unless it is misleading or linkbait; don't editorialize.
Has the submission title just been editorialized? I swear I’ve seen it mentioning data collection before, now it’s just bland.
deaux
Page title - Meta’s AI Smart Glasses and Data Privacy Concerns: Workers Say “We See Everything”
Original HN title - The workers behind Meta’s smart glasses can see everything
Editorialized HN title v1, 7 hours after post - A hidden workforce behind Meta’s new smart glasses
Editorialized HN title v2 - Meta’s AI smart glasses and data privacy concerns
sharkweek
I really want to make a fake PSA that suggests anyone wearing the Meta glasses is probably a pervert and should be proactively avoided/shunned.
This product cannot be allowed to exist in the type of world I want to live in.
The power structure wants these to succeed in the market for so many horrific reasons and it will require some serious societal muscle to reject them.
sschueller
Of course, why wouldn't they? The glasses do not work without a Meta account. /s
Is anyone at Meta going to be held accountable?
An absolute privacy nightmare especially in places like Switzerland or Germany where recording people (subject focus) even in public is not permitted without consent but you have tourists now showing up everywhere wearing these.
The LED is barely visible during the day and some have modified their glasses to disable/remove it.
tomkarho
There are no privacy concerns because there IS no privacy. /s
webdevver
I mean, there's kind of no way around it. How else are you going to get the training data you need? The only way to bootstrap AI is to tag the data with bio-AI first (humans).
Different companies "launder" it differently: with voice, it was done via "accidental" voice assistant activations. I guess with glasses, maybe there will be less window dressing this time. After all, they are clearly pitched to see what you see, at all times of the day.
A similar controversy happened with the various Roomba products, although arguably that was a combination of data harvesting and lazy engineering.
diego_moita
[flagged]
pstoll
TL;DR: the recorded media isn't end-to-end encrypted, and they aren't selling it but instead using it to train their own systems. What is new here?
rr808
I think they're dumb but my wife loves them. The video quality is surprisingly good.
socalgal2
Hilarious that a post about collecting data is on a site that collects data
I do think it's completely unacceptable if Meta makes the glasses unable to be used for routine functions without (a) other humans reviewing your private content and (b) AI training on your content. There needs to be total transparency to people when this is happening - these are absolutes.
But I'm a bit confused by the article because it describes things that seem really unlikely given how the glasses work. They shine a bright light whenever recording. Are people really going into bathrooms, having sex, sharing rooms with people undressed while this light is on? Or is this deliberate tampering, malfunctioning, or Meta capturing footage without activating the light (hard to believe even Meta would do this intentionally).
I'll confess that I like my Meta Ray Ban glasses: I love using them to listen to podcasts at the pool/beach, while riding my bike, and it's cool to snap a quick picture of my kids without pulling out my phone.
I wish this article (or Meta) were a bit clearer about the specific connection between the device settings and use and when humans get access to the images.
My settings are:
- [OFF] "Share additional data" - Share data about your Meta devices to help improve Meta products.
- [OFF] "Cloud media" - Allow your photos and videos to be sent to Meta's cloud for processing and temporary storage.
I'm not sure whether my settings would prevent my media from being used as described in the article.
Also, it's not clear which data is being used for training:
- random photos / videos taken
- only use of "Meta AI" (e.g., "Hey Meta, can you translate this sign")
As much as I've liked my Meta Ray Ban's I'm going to need clarity here before I continue using them.
TBH, if it were only use of Meta AI, I'd "get it" but probably turn that feature off (I barely use it as-is).
Don't you need to obtain consent before filming random people in the street? I already feel uncomfortable when someone takes a photo in public and I happen to be in it, but this type of device takes things to an entirely different level. With smart glasses, there's no visible cue that you're being recorded. No phone held up, no camera in sight. I'm questioning the legality of this in Europe, where privacy laws tend to be stricter. In the meantime, should I just assume that anyone wearing these glasses is always filming? And would I be within my rights to ask them to stop the moment I notice them?
Ghanaian authorities are seeking the arrest of a Russian national who was using the glasses to record himself picking up, and sleeping with, women in Ghana and Kenya. He uploaded the videos to social media and Telegram. It was quite the story on African tech Twitter last month.
https://www.bbc.com/news/articles/c9wn5p299eko
Everyone should assume that _anything_ connected to the internet will get uploaded to the internet, and that someone within the company will have permission to review the contents regardless of what the policy says. Common reasons:
1. Debugging for troubleshooting.
2. Analytics for making the product better.
3. Bugs that collect your info when they shouldn't.
4. Bugs from 3rd-party vendors, if the company uses those.
5. Insecure processes. Getting access to private content within the company is trivial due to coarse permission models.
Source: I worked at two well-known social media companies, on Trust & Safety and data infra teams.
I was in engineering school back in ~2012 when Google Glass came out. One of my classmates got hold of a pair when they were still quite uncommon and wore them to an extracurricular club meeting. Within minutes someone made a comment about him wearing the "creeper" glasses and asked if he was filming. He never wore them to the club again.
I just don't see a world where that doesn't happen with Meta glasses.
My concern was whether the glasses might record or transmit data while switched off or in standby mode. From what I can tell, they don’t do this intentionally. So the risk is broadly similar to other modern electronic devices.
The creepiness concern is real, but I think people misplace where the actual surveillance happens. The most consequential stores of personal data aren’t ad networks; they’re things like banks, hospitals, insurers, and telecoms. These institutions hold information about your health, finances, movements, and relationships, indexed and searchable by employees you’ve never met, governed by policies you’ve never read.
Realistically, there’s very little an individual can do to completely opt out.
My take is: if the main outcomes are that I get shown ads for things I don’t need and my facecomputer knows the difference between a fork and a spoon… I… I can live with that.
As a dev, I think the core issue isn’t whether one indicator LED can be bypassed — motivated people can bypass almost any client-side control. The trust boundary is policy + defaults. If enabling “AI features” implicitly authorizes broad retention/review, users won’t understand the tradeoff until after the fact.
A better pattern would be tiered modes with explicit UX: local-only capture, cloud processing without retention, and opt-in retention/training with visible status. If the product can’t technically support that separation today, that limitation should be stated plainly in setup, not buried in policy docs.
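As a rough sketch of that separation (the tier names and policy function below are my own illustration, not Meta's actual modes or API):

```python
from enum import Enum, auto

class CaptureMode(Enum):
    """Illustrative capture tiers -- hypothetical, not Meta's real modes."""
    LOCAL_ONLY = auto()       # media never leaves the device
    CLOUD_EPHEMERAL = auto()  # cloud processing allowed, no retention
    OPT_IN_TRAINING = auto()  # retained, eligible for human review/training

def may_human_review(mode: CaptureMode) -> bool:
    # Retention and human review are gated on exactly one explicit tier,
    # never implied by enabling an unrelated "AI feature".
    return mode is CaptureMode.OPT_IN_TRAINING
```

The point of modeling it this way is that the permissive behavior has to be requested by name; a user who never selects the opt-in tier can't end up in a reviewer's queue as a side effect.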
Sooo... I really should start keeping this[1] running all the time...
https://github.com/yjeanrenaud/yj_nearbyglasses/
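For the curious, detection like that can be as simple as filtering advertised Bluetooth device names. A minimal sketch of such a heuristic (the substrings are my own guesses for illustration; the linked repo may match on manufacturer data or other advertisement fields instead):

```python
# Hypothetical heuristic: does an advertised Bluetooth device name look
# like a pair of camera glasses? The hint substrings are illustrative
# assumptions, not the linked project's actual detection logic.
GLASSES_NAME_HINTS = ("ray-ban", "meta glasses", "stories", "oakley meta")

def looks_like_camera_glasses(advertised_name: str) -> bool:
    if not advertised_name:
        return False
    lowered = advertised_name.lower()
    return any(hint in lowered for hint in GLASSES_NAME_HINTS)
```

A real scanner would feed this filter with names pulled from BLE advertisements (e.g. via a scanning library) and alert on matches.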
I do not care about the privacy of people who buy these glasses nor their families.
I care about the innocent people whose privacy is invaded by people who buy these glasses.
I am so far removed from the type of person who might consider buying something like that. You'd have to be exceptionally impervious to social cues to even think of wearing that in public.
If you're blind, it's of course understandable but that's pretty much it in terms of cases in which I would consider the glasses acceptable to wear in public.
I sincerely hope someone in Japan or Korea gets caught using these to peek under clothing on the train, so the glasses get the forced camera-sound treatment that smartphones got over there.
Then the world can label them "hentai glasses" and move on.
I got a pair as a gift and didn't look much into them but I have to be honest, I assumed any data I captured - voice, video, etc. - would be sent to their servers (to use their models) and they'd be using it for training with humans in the loop.
Tbh the only thing I really use the glasses for are listening to music or talking on the phone - so basically how you'd use airpods. I don't use airpods because I had an ear injury that prevents me from using them on my left ear, so these glasses were kinda nice for that. I really wish they didn't have a camera though because I do always feel compelled to remove them if I interact with people.
I also have to add that the quality is mediocre. They're a month old, the case sometimes has problems charging, and one of the screws at a hinge keeps coming loose no matter how often I retighten it.
Is anyone here actually surprised Meta is recording and reviewing their content?
Vote with your dollars people.
Usage of creep-ware won't be tolerated in the social groups I take part in.
We will shame hard anyone who uses this sh1t.
Conditioning the crowd gradually towards being monetised in some of the most egregious ways - first pay for the glasses, then pay with revelations of private life sold to government (ICE?), business (private insurance) and so on. Super evil.
And despite this, there is no strong will to detach from what they produce - neither in the beginning nor later, when it is considered part of the cultural fabric. That’s how good their tactics are.
And for the pay one gets working for them - screw the world! I won’t use it anywhere near my loved ones, but I will build it.
We need safe spaces where you aren't constantly living inside the panopticon...
Well - don't wear their spyglasses. It's really not that hard.
You can still record stuff without spyglasses. People do that on YouTube too, e.g. first amendment audits. It's not that different from the spyglasses, except that you can cut Meta out of the process. (Admittedly YouTube creates another problem, which is called Google. It would be nice if we could have platforms without a corporate overlord, but the financial aspect may still be an issue that requires solving. I don't have a good way to solve that, as I also have a 100% zero-ads policy, i.e. using uBlock Origin mandatorily. And Google declared total war against uBlock Origin, we all know that.)
At a friend's party recently, I met someone who told me that they had worked in data for Meta's glasses division and warned me never to get Meta glasses for this very reason—that the workers can see everything. They told me of a comical case where a guy pulled down his pants to look at his penis, asked "Meta, what is this?", and the AI responded that it was a thumb. XD
Privacy policies and usage terms are like the magic wand of the industry. Whatever totally bad thing they want to do, and however they want to abuse you and your data, they just have to add a few unreadable lines in a 40-page document and that's it.
No one will read it, but even if you do, most of the time FOMO or the sunk-cost fallacy will make you go on anyway. And then it is a free pass for them.
To a technical person, this is obvious. AI doesn't happen on the glasses, it doesn't happen on your crappy phone, it happens online. Live streaming, which is also a feature, by definition sends everything it captures to someone else's computer (ahem, the cloud).
Yesterday I saw an Instagram reel of a guy asking "what am I looking at" while between his girlfriend's legs. Congrats, some Indian guy saw her too.
The core piece of information that is missing or unclear is whether this collection happens also when not actively and knowingly sending data to the cloud.
The glasses let me record videos locally - can Facebook see any frames of them? This is the question that needs to be answered. Everything else is nonsense like "omg Amazon hears what I tell Alexa".
Often I hear: "These are so cool! It's a shame Meta makes them." All of their pivots will fail because of their track record, IMO.
What I don't understand is where this data is coming from. Is it actually Meta's raybans or is it project aria (https://www.projectaria.com/)
Because I didn't think that the data was uploaded to meta by default, when you take a video with the raybans.
Moreover, I didn't think those glasses could record more than 2.5 minutes.
The point still stands, though: the devil is in the details of the "privacy" policy.
The whole project is a Creepy privacy nightmare.
I am confused that people act so offended. What do people expect? Meta and all tech companies do not believe in privacy for anyone but themselves. Isn't our info how they make their money?
The privacy implications here go beyond just the recording light. The real concern is how Meta trains AI models on captured data without explicit consent — and most users have no idea this is happening.
Of course! Glasses with cameras are a classic secret spy gadget :)
"Workers can see everything" means this isn't an AI privacy problem. It's a surveillance-as-a-service problem with extra steps.
While it may be legal for an individual to film something, it is certainly not permissible to process video data of this sort at scale.
I don't agree that responsibility to comply with Swedish law is on the wearer. This should motivate prosecutors to immediately order raids to secure any data relating to the processing of the data.
I also think the Swedish camera surveillance law is also applicable and there's a deceptive element since the cameras are disguised as glasses.
How does this not fall afoul of states with two party consent laws around recording conversations? Particularly since California is one of the strictest states.
It's the same issue with Tesla collecting camera feeds through their cars to use for machine learning.
Those videos can also be used to track people. IMHO, each Tesla owner sending video data to Tesla's data centers is violating privacy laws!
I'm not sure if there is any use case that could convince me to mount an internet connected device to my head at all times.
Smart glasses are potentially a great boon for mankind, really; it's just that both of the iterations we've had have come from two companies that are arguably detrimental to humanity.
Hopefully this causes Meta to be more transparent about what data is sent to their annotators. It seems like even the annotators didn't know whether the person explicitly hit record (accidentally or not) or whether the footage was sampled from a constant stream. That makes it impossible for anyone to meaningfully consent to the purchase agreements.
Everything else in this article is horrific, but this stuck out to me:
> “The algorithms sometimes miss. Especially in difficult lighting conditions, certain faces and bodies become visible”.
Right, “difficult lighting conditions,” not sure when we’d run into those in situations where we might be concerned with privacy. A 97% success rate looks good on paper.
Is it paranoid to assume every device with a camera/mic can see/hear everything?
That's my default assumption.
The title is now “She Came Out of the Bathroom Naked, Employee Says”
Of course they can, why would one expect anything else? However if you look through their processes I am sure they are covered by some legal jargon to do the bare minimum in terms of security. They will have every knob available to debug to the lowest level possible and view everything
Crazy to have $1 trillion invested in data centers, underpinned by dollar-a-day human Turk ops.
Besides the privacy part, I fail to see what value these glasses bring that a smartphone with a camera can't already provide.
And you're still forced to carry a smartphone anyway, since these glasses require an internet connection.
Is this fashion, or something I'm not aware of? They look horrendous to me.
What the hell? I thought the videos went to the phone directly, they're all getting uploaded to Meta? I don't know why I let my guard down against that company for one second.
EDIT: Wait, is this when you use the "ask Meta" feature? I do expect that to send all the clips to a server for an LLM to process, it's not done on-device. It's not clear to me whether it's that or just all videos/photos you record with the glasses.
Meta's business model is premised on intensive and pervasive user surveillance.
When you use Meta's products and services you are tagged, tracked, and commodified like an animal. You are cattle.
The question isn't whether or not Meta's AI smart glasses raise data privacy concerns.
The question is why use anything from Meta in the first place?
Despite Meta's historical misadventures, if people still use their products with an expectation of privacy, that's on the people.
Surely this is already happening with our other devices? Not that it isn't a problem but that the game is already lost...?
The privacy angle here is fascinating. Curious if anyone has tried running the on-device model locally yet?
I would really love to use smart glasses for DevOps, especially Grafana dashboards
The recording light argument keeps coming up, but I don't buy it. I can't tell if someone's glasses have a tiny LED on from across a room, and neither can anyone else. Under GDPR it's on Meta to handle consent, not on me to squint at someone's face to figure out if I'm being filmed.
Fun fact: all advertiser chat support agents at Meta used to (and still might) have full super-read access on FB. When you read "workers" in this headline, don't think "devs"; think "legions of contracted-out T1 support staff".
Ah yes, while everyone was focused on Flock cameras...
For many more reasons than pervert behaviour, I agree that this kind of tool cannot coexist with healthy society. "Glassholes" was a delightful portmanteau, but I suspect normalising a term like "pedo glasses" will probably put people off them way sooner and faster. At the very least it identifies the product and not the person as the problem.
Meta glasses will scare people in public because they'll think they're being recorded even when they're not.
I'm against surveillance in general and I see many people being against these glasses, yet not caring at all about surveillance cameras. Flock in the USA is a bit of an outlier in that it got some people riled up, but where I live in Europe there are private cameras looking out of at least half of the buildings, maybe more. So if you're walking down the street for 15 minutes, you'd be caught by tens or hundreds of cameras from various manufacturers, installed by various business and homes. Who knows how many have microphones, which server they store their feed in, what security each cam has and so on.
I asked 2 cops in a patrol car if I could install cameras on my own and how I should go about it. They said they don't mind them. Officially it's illegal unless you have a permit, but it's so widespread and the law is so unenforced that it's practically 99.99% legal.
I can point a few cameras to the street and record everything 24/7. When I'm on a bus I'm being recorded by a few cameras. On most bus/tram/subway stops there are cameras. In stores and public buildings there are cameras. Most cars have cameras for insurance or general safety concerns. Self-driving cars would have to have cameras, as well as delivery robots.
If we accept this shitty reality, why shouldn't I wear a camera and a mic, too?
> Meta
> Privacy
Pick one.
There must be a special place reserved for Mark Zuckerberg in hell
Interesting article, but I wonder why the journalists didn't go all the way. Sure, Meta isn't going to comment when you ask them what data they have. But this is in the EU, just hit them with a Subject Access Request under GDPR.
Would be really interesting to create a completely new account, use the glasses with all upload settings off for a month, and then SAR request and see what they have...
This simply needs to be criminalized.
Basically, it's a peeping tom device.
Too funny that the subcontractor working for meta is “sama”
Not all technology is good.
To their credit, all interesting consumer tech introduces privacy concerns because to do interesting things you usually have to integrate tightly with someone's life. I'm just not that concerned about filming sunglasses, in fact I'm kind of excited
These glasses are good ones.
What kinds of defensive measures can you even take against such a blatant and yet inevitable invasion of privacy that don't involve you just completely covering your face whenever you go out in public?
I look forward to the day when I can have a fully FOSS, trustworthy pair of smart glasses, made by people who genuinely want to and do put user privacy first. But until then, no fucking way. I don't even like keeping my cellphone in the same room as me when I'm at home.
"People just submitted it. I don't know why. They 'trust me'. Dumb fucks."
-Mark Zuckerberg, 2004
The annoying thing is that even if you yourself don't use these glasses, as long as people around you do, you are still affected by it. We really need laws to limit always-on recording devices in places where we have an expectation of privacy.
Oh look a flock competitor
Who would even expect any privacy from the Facebook mafia? The rename to "Meta" doesn't obscure the fact that they are bottomfeeders.
this should be known by everyone
If you're in public you have no privacy by default.
You would have to have been hiding under an extremely large rock not to assume this given the technology involved and Meta's overtly and consistently anti-privacy stances and history.
FTA:
> "I saw a video where a man puts the glasses on the bedside table and leaves the room. Shortly afterwards his wife comes in and changes her clothes."
> "The workers describe videos where people’s bank cards are visible by mistake."
This is hugely concerning. We need more details. Why are the glasses recording when not being worn? Is the light on when they're recording?
Are the Meta employees able to turn on the streaming without people knowing? Are these videos only when someone says "Hey Meta..."? Are the Meta employees looking at every "Hey Meta..." video where someone asks AI a question?
These glasses are considered a luxury item and are worn by executives in office environments. They are worn by people in family situations. Someone could be in a confidential or private moment and randomly ask AI a question - one of the primary purposes of the glasses. Are all of these being seen by Meta employees?
It's genuinely uncanny to see good tech journalism... it's normally so much worse than this.
Why in the world did they even try this again? What market is there for it beyond creeps? Or is that the hot thing right now?
Meta needs to make a find-your-lost-dog commercial for their smart glasses ASAP.
I think this coverage feels very similar to the way Google Glass was treated back in the early 2010s ... there’s a grain of legitimate concern, but the article oversells what these glasses actually do and stokes alarm in a way that goes beyond the available facts.
Workers annotating data for AI might see sensitive content captured by smart glasses. But the leap from that to “we see everything” and framing it like some dystopian panopticon mirrors the early Google Glass panic, where the concerns often outran what the device actually could do.
Legitimate concerns shouldn’t be dismissed, but neither should they be inflated to create a new “Glass-forked-into-Big-Brother” narrative unless the evidence genuinely supports that level of risk ...
The post title has been repeatedly edited to make it vague and to remove all mention of the concern.
Actual title is “She Came Out of the Bathroom Naked, [Meta] Employee Says” and subtitle begins with “Bank details, sex and naked people who seem unaware they are being recorded”
Suspicious moderation behaviors on this one
Likewise, are there any startups making wearable devices that visually jam or impair digital cameras?
Can we just get Robert Scoble to come back and destroy these, please?
I already personally refuse to be around anyone who wears them. And I think establishments should just outright ban them.
Good reporting, but this has always been Meta's M.O. so I'm really not surprised.
The sooner we collectively stop trusting them (and maybe even actively campaign to have the U.S. government meaningfully regulate them), the better.
Personally, I would like to see the company stop existing and its executive board destitute.
Those glasses have a tiny white LED when the camera is on. It really needs to be more obvious. This might be something we'll need legislation for, since Meta is an evil-ish, immoral company.
Just think of the children. Changing a soiled garment, transmitting video of the whole ordeal, isn't that super illegal?
I really hope these flop and don’t become mainstream.
It would be a surveillance and privacy dystopian nightmare.
Only Meta and Zuck would continually introduce invasive products.
I won't even walk into a house with Alexa devices around, there is no way I'm going to let Meta glasses be in the same room as me.
The article is somewhat disingenuous because it "forgets" to mention the bright LED on the glasses while filming. This makes the article's claims that people don't know about video recording much less believable.
Color me shocked.
"my spying glasses are spying on me"
Holy shit! This is absolutely despicable and probably the worst tech news I've read all year. Why do people still support Meta/Facebook?!?!
I love the Facebook glasses, they seem to be the swan song of a shitty company. Young people have abandoned Facebook when their parents started hanging out, now it's all boomers and bots posting conspiracy theories.
If they think this surveillance tech is going to push the company forward, it means leadership is even more disconnected from reality than the Amazon people who greenlit the superbowl ad. It means the company is dying. Huzzah!
On an unrelated note, the FT reported today [1] that Israel was able to track Iranian leadership by hacking "nearly all" of the traffic cameras in Tehran. Anyways, I think we should continue to put as many networked cameras, microphones, and other sensors in as many products as possible. There are no downsides!
[1] https://archive.is/QSCjf
Why was the title changed from "The workers behind Meta’s smart glasses can see everything" to "A hidden workforce behind Meta’s new smart glasses"? It doesn't go against any guidelines:
> Please don't do things to make titles stand out, like using uppercase or exclamation points, or saying how great an article is. It's implicit in submitting something that you think it's important.
> If the title includes the name of the site, please take it out, because the site name will be displayed after the link.
> If the title contains a gratuitous number or number + adjective, we'd appreciate it if you'd crop it. E.g. translate "10 Ways To Do X" to "How To Do X," and "14 Amazing Ys" to "Ys." Exception: when the number is meaningful, e.g. "The 5 Platonic Solids."
> Otherwise please use the original title, unless it is misleading or linkbait; don't editorialize.
The literal URL slug is
> metas-ai-smart-glasses-and-data-privacy-concerns-workers-say-we-see-everything
The page title is
> Meta’s AI Smart Glasses and Data Privacy Concerns: Workers Say “We See Everything”
The new title goes against the guidelines by editorializing. I've never seen HN do this before, what's going on here?
Brought to you by the CEO that tapes the webcam on his laptop
https://www.theguardian.com/technology/2016/jun/22/mark-zuck...
Has the submission title just been editorialized? I swear I’ve seen it mentioning data collection before, now it’s just bland.
Page title - Meta’s AI Smart Glasses and Data Privacy Concerns: Workers Say “We See Everything”
Original HN title - The workers behind Meta’s smart glasses can see everything
Editorialized HN title v1, 7 hours after post - A hidden workforce behind Meta’s new smart glasses
Editorialized HN title v2 - Meta’s AI smart glasses and data privacy concerns
I really want to make a fake PSA that suggests anyone wearing the Meta glasses is probably a pervert and should be proactively avoided/shunned.
This product cannot be allowed to exist in the type of world I want to live in.
The power structure wants these to succeed in the market for so many horrific reasons and it will require some serious societal muscle to reject them.
Of course, why wouldn't they? They do not work without a meta account. /s
Is anyone at Meta going to be held accountable?
An absolute privacy nightmare, especially in places like Switzerland or Germany, where recording people (as the subject of the footage) even in public is not permitted without consent - yet you now have tourists showing up everywhere wearing these.
The LED is barely visible during the day, and some owners have modified their glasses to disable or remove it.
There are no privacy concerns because there IS no privacy. /s
I mean, there's kind of no way around it. How else are you gonna get the training data you need? The only way to bootstrap AI is to tag the data with bio-AI first (humans).
Different companies 'launder' it differently: with voice, it was done via "accidental" voice assistant activations. I guess with glasses there will be less window dressing this time; after all, they're clearly pitched to see what you see, at all times of the day.
A similar controversy happened with the various Roomba products, although arguably that was a combination of data harvesting and lazy engineering.
TL;DR: the recorded media isn’t end-to-end encrypted, and they aren’t selling it but are instead using it to train their own systems. What is new here?
I think they're dumb but my wife loves them. The video quality is surprisingly good.
Hilarious that a post about collecting data is on a site that collects data