This is going to be a huge chilling factor for employees. You’d no longer be able to dissent, or discuss anything non-work-related, with even the slightest expectation of privacy.
Yes, they could have accessed logs before, but there’s a difference between directed checking after incidents and active surveillance at scale.
lukeschlather
I really don't understand how this is legal. I guess Facebook maybe doesn't actually have any compliance requirements in the USA, but time-series screenshots of any SRE's screen are going to contain data that should not be stored by some data vacuum. I know Meta has a reputation for shitty data handling practices and US regulations are light compared to Europe's, but how are they planning on securing passwords, encryption keys, PII, etc.? Can employees turn this off at their discretion? What happens if someone forgets to turn it off before they cat the companywide ssh root private key? Even setting aside legality, someone with access to this training data would have what sounds like an unacceptably broad level of access to company systems, unless Facebook wants to get hacked.
Avicebron
Yeah, this is crazy. Remember when engineers were actually engineers and that meant something? Imagine asking to install spyware on your law firm's company laptops because you didn't trust the lawyers not to make some deal with the judge. Or demanding 24-hour monitoring of everything a doctor does because you need to review the footage at any time.
EDIT: While we are here, let's do this for politicians as well :), publicly available, auditable 24-hour surveillance.
Xmd5a
Makes me think of how the US army trains their waterboarders: by waterboarding them first.
The goal is to manufacture a lack of empathy along the lines of: "why should I treat this person better than I was treated".
wrs
>data collected would not be used for performance assessments or any other purpose besides model training
And you expect Meta employees, of all people, to believe this?
zxc3
So, back in 2021, I supervised a student project where we aimed to simulate human interaction with the browser. Obviously, we needed data on human interaction. After discussion, we ruled out collecting data from a group because:
- the project was time-constrained, so there was hardly any time, and
- there were serious ethical questions which could never be addressed well within the allotted time for this project
So we ended up discarding the idea of collecting data from a representative group, even before we got to the point of asking "how do you handle that ethically". We ended up collecting data from one subject: the student in question, indeed. He handled the data, from which he derived heuristics that simulated it. The collected data therefore never left the student's hands.
<sarcasm>Silly us, we should have just not bothered and collected it from anyone and anywhere. Apparently.</sarcasm>
In all seriousness, this callous and complete disregard for ethical questions offends me so very much.
rkagerer
It will be interesting to see how the people who maintain (in my opinion) one of the worst offending organizations out there for invading your privacy - and generally treating you in a manner that lacks human decency - respond to having their privacy invaded, and being treated without basic decency.
I realize you can argue whatever is done at work should have no expectation of privacy, and I get that, but as an employer myself I've always felt that schemes like keyboard and mouse tracking go a chasm too far. Your employees are human beings, not robots. Even in the older context of corporate productivity tracking there are far better metrics available - starting with, I don't know, maybe talking to your employees and asking them how things are going.
I wouldn't have a problem if it were opt-in, but if this were foisted upon me I would surely quit.
jmull
I like to imagine they’ll mostly capture meta employees using AIs to do work.
Then they’ll deploy models trained on this, and begin capturing employees using AIs that are good at using AIs to do work.
Repeat a few times and they’ll start capturing the keystrokes of people mashing their heads into keyboards with despair and exclaiming, “Why can’t these models do anything anymore!!”
LandenLove
Next gen AI is going to become really proficient in scrolling Hacker News.
VerifiedReports
What toxic trash.
I hope this is widely hacked. If these employees are any good, someone will whip up a countermeasure that feeds absurdly wild and nonsensical data into Meta's fetid, gaping maw.
rldjbpin
for the company that is one of the major players in tracking similar data across the web, i don't see much wrong with this.
if they continue to share their work through open releases despite the leadership change, i hope we get to benefit from it.
not quite optimistic about the result, as i wonder whether, on aggregate, we all interact with computers in the most efficient way possible. maybe it could help beat captcha or scraper detection through mimicry.
ninjahawk1
Now this is some hacker news.
We’ve been moving towards a more and more tyrannical, company-controlled society for a long time, and now they’re straight up using hacking tactics to train machines to take our jobs. Doesn’t get much more bleak than that.
redleader55
I'm so happy that the EU and UK have laws against this kind of thing, so I will still be able to work somewhere in the future (TBD what "future" means, though).
eloisius
Yesterday I was doomscrolling computer vision related stuff on LinkedIn. I hate it, but I’m often looking for freelancing ops in CV. A video appeared in my feed of some South Asian laborers sewing in a garment factory. All of them had cameras mounted on their heads. Otherwise, they looked exactly like you’d imagine beleaguered sweatshop workers would look: exhausted, dull expressions looking into the camera as whoever was filming the video walked by.
The presentation of the video and all the comments were about how awesome and cool egocentric video understanding research is, and how it’s going to totally obsolesce human labor. I couldn’t get over how grim the video was. Here are some people in one of the least desirable positions in the world, and that’s not enough. Now they must labor without a shred of dignity, knowing they’re training their own replacements, with likely not a thing they can do about it.
I’ve struggled to find enough freelance work to stay busy recently, but more than that I’m starting to feel a moral crisis. It’s getting harder and harder for me to feel like what we’re collectively doing isn’t absolutely fucked.
It seems like every tech company is moving towards the sweatshop model pioneered by CrossOver/Trilogy, treating engineers as human CPUs at best, monitored 24/7.
loeg
For context, when the article says "a list of work-related apps and websites," this includes Google properties like Gmail and Docs, and social media websites like Facebook and Instagram, with no provision for excluding personal accounts.
eqvinox
I bet this doesn't include higher management. Wonder why.
beloch
For those saying that this is fine because company computers are company property...
This is like going to work in a drug-lab where everyone is required to strip naked to ensure no "product" can be smuggled out. It's a zero trust environment at first blush, with the added terror of it being used to replace you with AI.
People working naked in a drug lab have more job security than meta employees and an equivalent level of respect and trust from their employer. However, they can't unionize because they have no legal protections. Their employer could literally point a gun at them if they complained. That isn't the case for Meta employees. Just sayin'.
pugio
Growing up we learned about _Slaughterhouse 5_ and _Cat's Cradle_ by Kurt Vonnegut. But there's not enough discussion or awareness of _Player Piano_. Incredibly prescient. These kinds of dystopic headlines are exactly the kind of thing you'd see in the book.
mbgerring
UNIONIZE. If it’s not obvious to you now, it never will be.
fidotron
Meta going all in on their brand with this.
Someone had to do it, distasteful though it may be. Could be quite hilarious what it learns in the process.
aldielshala
Honestly, I doubt this data is as useful as they think.
Half my workday is me browsing random tabs while an AI agent does the actual work. They're going to train a model on alt-tabbing and scrolling HN/Twitter/Reddit.
stingrae
If it is available for training, I assume it is available for discovery.
atleastoptimal
Do most people who work in AI companies realize that if this buildup of reasoning models succeeds at what every tech CEO is aiming for, all of them will be out of a job?
vidarh
So happy I declined to even start the Meta interview cycles. The company seemed ridiculous even back then, but this is next level.
type0
Do they each get their own Meta Ray-Ban glasses as well, which they have to wear at all times, even in bathrooms?
dbgrman
Because ends justify means. To quote Boz himself:
“The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned.”
CarbonCycles
How is this supposed to improve productivity? I'm still struggling with the framing - what business productivity is actually gained from this?
I will say that I feel for the folks who work at Meta... I can't help but feel the company jumped the shark long ago.
vigneshwaraya
this would be a good time for Meta employees to reconsider their life choices.
motoboi
This is how Anthropic captured the coding-agent market so fast. You need training data, and users are giving it to you.
Being a terminal application, all interaction is trainable signal (unlike, say, Cursor, which is an IDE and lets users freely explore, edit files, and move the mouse. The model sees none of it - nothing to train on).
So Meta is doing the obvious: we want to train a computer-use model, we need training data. Better to capture it from employees than to buy low-quality data.
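The pipeline that comment describes is straightforward to sketch: each captured action is predicted from the interaction history before it, giving supervised (context, action) pairs. A minimal, hypothetical illustration - the event schema and field names here are invented for the example, not anything Meta or Anthropic has published:

```python
# Hypothetical sketch: turning a captured terminal-interaction log into
# (context, action) pairs for supervised fine-tuning. The event schema
# ("action"/"result") is invented for illustration.

def to_training_pairs(events):
    """Each event's action is predicted from the history before it."""
    pairs = []
    history = []
    for event in events:
        context = " ".join(history)
        pairs.append({"context": context, "action": event["action"]})
        history.append(f'{event["action"]} -> {event["result"]}')
    return pairs

log = [
    {"action": "ls", "result": "src tests"},
    {"action": "cat src/main.py", "result": "...file contents..."},
]
pairs = to_training_pairs(log)
# pairs[1]["context"] now contains the first command and its result.
```

This is also why terminal agents are such a clean capture surface: every action and its result is already serialized text, whereas IDE mouse activity has to be reconstructed before it looks like this.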
At this rate they're not going to need to do layoffs… nobody sane is going to want to work there.
vvpan
Everybody will be a serf under technofeudalism.
tasoeur
Ironically, I’d be surprised if this wasn’t already the case before? I recall vividly employment contracts with meta in 201X with a clear mention that employees were giving up any sense of privacy while using meta provided devices or entering meta’s premises…
ramon156
What would be the exact opposite of Meta as a company? Small, privacy-focused, HN blacklisted at work? Am I missing something?
Could a few of the smart people there please find a way to poison this data set? Before something similar ends up on my work computer.
zelphirkalt
There is a danger in this. Small companies with delusional people in them will see "how the big guys do it" and try to apply this kind of thing in their own little fart of a business, making our dev/engineer lives miserable.
I wonder if there's a market for a little usb fob that does nothing but meander the mouse cursor about the screen in a path that, upon proper rendering, would appear to be a ...
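For what it's worth, such a fob wouldn't need anything fancy: a bounded random walk already produces cursor traces that look plausibly busy at a glance. A toy sketch of the path generation only - pure computation, with invented parameter names; actually driving a cursor would need a HID gadget or a library like pyautogui:

```python
import random

def wander(width, height, steps, start=None, max_jump=40, seed=None):
    """Generate a bounded random-walk cursor path as a list of (x, y) points.

    Each step moves at most max_jump pixels per axis and is clamped to the
    screen rectangle, so the path never leaves the display.
    """
    rng = random.Random(seed)
    x, y = start or (width // 2, height // 2)
    path = [(x, y)]
    for _ in range(steps):
        x = min(max(x + rng.randint(-max_jump, max_jump), 0), width - 1)
        y = min(max(y + rng.randint(-max_jump, max_jump), 0), height - 1)
        path.append((x, y))
    return path

path = wander(1920, 1080, steps=100, seed=42)
# Every point stays on screen; consecutive points differ by <= max_jump.
```

Of course, a pure random walk would be trivially distinguishable from real use in any statistical analysis, which is rather the point of the "upon proper rendering" joke.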
starkeeper
I'm so excited to interview for a career at Meta!
Also, why are the investors not suing the legs off of Zuck for the whole metaverse debacle? It is a scam and pure fraud. Also a dumb name - sue for that too. Should have just renamed it meeme.
bryanrasmussen
Meta employees will start using AI to make fake mouse movements and keystrokes while goofing off.
gordon_freeman
If anyone still has not watched Severance, now is a good time to start watching that show!
gip
Setting the dystopia aside, without a lot more context I don't quite get how the captured data will be particularly useful for training models for, say, software engineering. If someone can shed light - thanks!
Now that the early-'10s dev worship era is officially over, all pretensions of "making the world a better place" and being nice have been dropped, and devs shall remember what it feels like to be a replaceable cog that can be swapped out the way we used to swap phone wallpapers.
jtemplestein
I wonder if this screen + mouse + keyboard (+ camera + speaker + mic) interface is really the right level of abstraction to model a “digital entity”
Sure, you can do everything a human can, but it also seems VERY inefficient
As an alternative, maybe you could just do network in/out?
nafistiham
With this data, Meta can make metahumans which pass reCAPTCHA for real.
cm2012
It will be funny when the AI learns to browse Reddit and watch porn during the work day.
hintymad
Maybe this is exactly why Meta poached Alexandr Wang. Data capturing is an heirloom technique passed down from his Scale AI days
negamax
The irony of this is so strange..
napolux
fines are negligible for these companies, so i also expect these policies to be applied to eu employees without telling them
Desafinado
Honest question: does most of Meta's creepiness trickle down directly from Zuckerberg, or are their entire executive ranks also this creepy?
Do the executives know better at this point, but the culture is so toasted that no one can fight against it anymore?
mandeepj
Microsoft was also doing the same in their Viva program.
Maufrais
Seems like Skan AI's solution. They have a few Fortune 500 companies as clients doing exactly the same thing as Meta - capturing keystrokes and mouse clicks to ultimately do next-level process automation.
If you then think of crazy companies such as Palantir, something really has to be done about those entities. As a first step I suggest disbanding those companies, for many reasons, ethics among them.
zer0zzz
I definitely see strong demand for an 11” 1 kg MacBook with these policies inevitably spreading.
Markoff
And here I am rejecting projects because I refuse to install a closed-source Chinese VPN that my client requires on my computer, though I told them I could just use the built-in Windows VPN or the open-source Hiddify.
Btw, do they at least pay them extra for this spying, or is it supposed to be free? I mean, if they paid at least 30-50% on top of the salary, maybe I wouldn't mind doing it on a dedicated Meta computer.
Uptrenda
Time to leave tech forever and farm onions, I think.
smalltorch
Gotta feed the beast somehow.
nektro
how to cause a mass exodus with this one simple trick!
ulfw
People are just being misused to train their own replacements.
Always thought Meta was a god-awfully run company, and this just takes the cake.
shepherdjerred
I can’t imagine being mad that the data collection company that I work for now wants data on _me_
Really though it seems reasonable to me. They want data to train AI, and their employees are obviously a large source.
They could already track your every click. They have root on your work MacBook. Most employers do.
phendrenad2
I can't imagine a more useless dataset to collect, proving that Meta might have reached the peak of the graph of (reach/grasp)/time and the numerator is about to plummet spectacularly.
colordrops
Eventually every word spoken as well, which is already the case for most meetings, but not yet for individual interactions. Every bit of information at companies will be accessible to AI. This will allow automation all the way up to the C suite.
nemo44x
They're probably not seeing the promised productivity improvements from AI in terms of shipped production code, as opposed to "super demos" that aren't robust. So they want to see whether the workers are really putting in the time, or whether the models struggle past a level of complexity that stalls or reverses early gains.
RobRivera
I mean - uh - gotta find all the signals that may exist.
I...admire the diligence
globular-toast
The engineers who implement this should be ashamed of themselves.
shmerl
1984-level sickening.
rvz
Meta can even afford to destroy themselves and their own employees.
More proof that they do not care about you at all. This is Meta's way of moving fast and destroying everything at all costs.
dwaltrip
Fucking insane.
Optimizing ourselves to death.
Capitalism is asleep at the wheel with its foot stuck on the gas pedal.
vrganj
Hey fellow engineers.
I know you've long been hypnotized by libertarianism and the cult of the individual.
Maybe it's time you reconsider in light of the overwhelming evidence that the capitalist class is, in fact, not your friend.
The only known way for workers to assert their rights is collective action. Alone, you are weak and replaceable. Together, we are strong.
It's time for a proper tech worker's union, to give us some fangs to claw back our dignity with.
alex1138
Zuck's a sociopath
bradlys
Data collection isn’t new. The training is.
lifeisstillgood
But this is a good thing. Let me explain.
Imagine a society where an individual’s rights are prioritised and where society is dedicated to the best interests of each citizen (not their desires or wants, but their reasonable, considered best interests).
Now imagine a society where your individual daily actions are recorded, reviewed and helpfully advised upon.
Millions of people making millions of actions each day, all recorded, compared, and sifted for positive feedback and overall improvement.
Just how far ahead would such a society pull compared to one that stays at today’s level? Compared to one that used totalitarian methods enabled by such surveillance?
The difference between Soviet and Western Europe was not the tech, it was the trust.
If we can build a society with such trust, then this tech will turbocharge us.
If …
Original source: https://www.reuters.com/sustainability/boards-policy-regulat...
https://archive.is/TYcpI
They 'trust me'. Dumb f*ks.
Relevant story (Manna)
https://marshallbrain.com/manna1
Why do we allow this?
I guess this is why they acquired https://www.limitless.ai/ ?
Another alt link: https://www.businessinsider.com/meta-new-ai-tool-tracks-staf...
Meta is like Big Brother in the novel 1984 now.