Traster

I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app" - as long as there are relatively good alternatives that do offer privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it, you can move to Snapchat or Signal or whatever platform of your choice.

Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.

In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy' I do think it's on you to also then put additional resources into tackling the downsides of that.

xeckr

Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.

ThoAppelsin

DMs are akin to private conversations in real life. Thus, every DM feature should entail E2EE.

It’s ok for a platform to not feature private conversations. They should just have no DM feature at all, then; make all messages publicly visible.

Private conversations are indeed not for all ages. Parents should be able to grant access to that on an individual basis.

ranyume

This might be off-topic, but it's on-topic about child safety: I'm surprised people are being myopic about age verification. Age verification should be banned, but people ignore that most widely used online services nowadays already ask for your age and act on it: Twitter, YouTube, Google in general, any online marketplace. They already have so much data on their users, and they optimize their algorithms for those groups in an opaque way.

So yeah, age verification should be taken down, along with the data mining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but not about what these companies serve them and what they do with their data.

computerex

TikTok is a front for government surveillance, so it's not really surprising that this is their position.

MetaWhirledPeas

"makes users less safe"

They don't believe that. It makes it more difficult to deal with governments, is all. Big Brother needs your messages from time to time, and TikTok is not willing to risk getting shut down to argue against that. We can't have pesky principles getting in the way of money.

gorgoiler

I don’t really understand how we are supposed to believe in e2ee in closed proprietary apps. Even if some trusted auditor confirms they have plumbed in libsignal correctly, we have no way of knowing that their rendering code is free of content scanning hooks.

We know the technology exists. Apple had it all polished and ready to go for image scanning. I suppose the only thing in which we can place our faith is that it would be such an enormous scandal to be caught in the act that WhatsApp et al daren’t even try it.

(There is something to be said for e2ee: it protects you against an attack on Meta’s servers. Anyone who gets a shell will have nothing more than random data. Anyone who finds a hard drive in the data centre dumpster will have nothing more than a paperweight.)

ronsor

Why would you use TikTok for private communications anyway? It's mostly a public short video sharing platform.

swiftcoder

> the controversial privacy feature used by nearly all its rivals

"controversial" according to who? The NSA / GCHQ?

beaker52

Someone in the UK government is furiously writing this down.

hexage1814

It doesn't matter. Web-based cryptography is always snake oil

https://web.archive.org/web/https://www.devever.net/~hl/webc...

cdrnsf

TikTok and other social media apps' business models are antithetical to privacy.

zzo38computer

In my opinion, the end-to-end encryption should be done by separate software from the communication app itself, although there is more to security than just programming the computer correctly (such as securely agreeing on keys and ciphers in person).

matricaria

Since when is E2EE controversial? Not using E2EE should be controversial.

dlev_pika

lol

It makes sense - they extract every possible bit of personal information from your device - why would they make you believe they care about your privacy?

You want to communicate privately? TikTok is not the place, and that’s ok. shrugs

maxdo

People are seriously discussing privacy in a Chinese app. With all respect, their government will not allow you even a hint of privacy.

pothamk

The core tension here isn’t really about encryption itself, it’s about moderation models.

Most large platforms rely heavily on server-side visibility for abuse detection, spam filtering, recommendation systems, and safety tooling. End-to-end encryption removes that visibility by design. Once a platform is built around centralized analysis of user content, adding strong E2EE later isn’t just a feature toggle — it conflicts with large parts of the existing architecture.
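
The conflict can be made concrete with a toy sketch (the XOR "cipher" and the keyword filter below are illustrative stand-ins, not any platform's real pipeline): the same server-side tooling that works on plaintext sees only noise once the content is end-to-end encrypted.

```python
import secrets

def toy_e2ee_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy one-time-pad XOR stands in for a real E2EE cipher.
    # The point: only the endpoints hold `key`, the server never does.
    return bytes(b ^ k for b, k in zip(plaintext, key))

def server_side_spam_filter(message: bytes) -> bool:
    # Centralized moderation/abuse detection needs to read content.
    return b"free crypto" in message.lower()

msg = b"Claim your FREE CRYPTO now!"
key = secrets.token_bytes(len(msg))

# Plaintext pipeline: the server can scan the message.
assert server_side_spam_filter(msg)

# E2EE pipeline: the server only relays ciphertext, so the same
# filter (and the safety/recommendation tooling built like it) goes blind.
ciphertext = toy_e2ee_encrypt(msg, key)
assert not server_side_spam_filter(ciphertext)
```

This is why retrofitting E2EE is architectural, not a toggle: every component that assumed it could inspect `msg` now receives `ciphertext`.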

maest

Do you feel safer knowing DMs are not encrypted?

gradientsrneat

A middle ground would be to implement E2EE but have messages signed (and ideally organized in a Merkle tree), so that if a DM is reported there's cryptographic proof that the accounts sent the messages.
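
A minimal sketch of the Merkle-tree half of this idea (SHA-256 only; the per-message signatures are omitted for brevity, and the reveal/report protocol around it is assumed, not specified by the comment): the platform commits to a root over a DM thread, and a reporter can reveal one message plus an inclusion proof without exposing the rest.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold a list of messages up to a single committed root hash."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves, index):
    """Sibling hashes needed to recompute the root from one leaf."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1                  # sibling is the paired node
        proof.append((level[sib], sib < index))
        index //= 2
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

dms = [b"msg0", b"msg1", b"reported msg", b"msg3"]
root = merkle_root(dms)             # platform signs/publishes only this root
proof = inclusion_proof(dms, 2)     # reporter reveals one DM + its proof
assert verify(b"reported msg", proof, root)
assert not verify(b"forged msg", proof, root)
```

The design point is selective disclosure: the moderator verifies the reported message against the committed root without the platform ever holding the other messages in plaintext.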

krickelkrackel

Just like door locks are making the world less free!

lucasfin000

I don't think the argument is really about child safety. If it were, TikTok would also be working on fixing the algorithm that can steer minors toward harmful content, which is a far larger documented vector than encrypted DMs. This is about preserving access.

matesz

Fun fact - there is a big correlation between World Wars and compulsory education. Of course governments and big corporations "care" about children. Of course!

sheept

I feel like this makes sense for a platform that targets teens. Plus, I wouldn't trust TikTok to implement E2E encryption properly—who knows what they've snuck into their client.

Schlagbohrer

This BBC article is insanely written.

blackqueeriroh

There is no way to do E2EE on a traditional social media platform with user-generated content and comply with existing US law.

You can’t moderate an E2EE platform.

_el1s7

That's fine. People who need E2EE shouldn't use TikTok anyway; there are plenty of other secure apps for that.

TikTok is a social media app, and it gets heavily abused as it is.

zthrowaway

Making users less safe from… letting us snoop on all your communications for “national security”.

0xbrayo

Unrelated, but I'm always surprised by the number of people who don't know that Instagram DMs are not encrypted by default.

2OEH8eoCRo0

What unsafe things are users most likely to encounter?

1970-01-01

I see it like this: Taking in the totality of the danger, they're right. If the source (social network) and the destination (child brain) cannot be treated as trustworthy, then you must control the content for overall safety. If you could trust either end, then you could dismiss the argument. But you cannot trust children to be cognizant of abuse, and you already know social media literally reinvented abusive behaviors for the 21st century. Do nothing and children will be harmed. Overreach by any amount and you have destroyed freedom. The only middle ground is weaker encrypted E2E comms. Something that creates a forcing function with very high cost (an electric bill or SaaS service) for the sniffer but can be broken with enough horsepower. Think about what millions of dollars per character would do. Good luck codifying that insane compromise into a law.
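
The "breakable with enough horsepower" idea hinges on brute-force cost, which grows exponentially with key length. A back-of-envelope sketch (the price per guess is a made-up figure purely for illustration) shows why tuning that cost into a legal "middle ground" is so brittle:

```python
# Hypothetical attacker price: $1 per 10 billion key guesses (made up).
COST_PER_GUESS = 1 / 10_000_000_000   # dollars per guess

def expected_crack_cost(key_bits: int) -> float:
    # On average an attacker searches half the keyspace before hitting the key.
    return (2 ** key_bits / 2) * COST_PER_GUESS

for bits in (40, 64, 80, 128):
    print(f"{bits:>3}-bit key: ~${expected_crack_cost(bits):.3g}")
```

At these assumed prices a 40-bit key falls for pocket change while 128 bits is astronomically out of reach; a few bits either way moves the cost by orders of magnitude, which is exactly why pinning it to a dollar figure in law would be so hard.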

gnarlouse

Maybe just don't use TikTok. Shocking that adults use a platform for children.

lwansbrough

The Chinese spyware app won’t do E2EE? I can’t believe what I’m reading.

dev_l1x_be

I take privacy suggestions from social media companies on a daily basis.

SuperSandro2000

hahaha, good one

9864247888754

And their target audience won't question it.

insane_dreamer

I'll never let my kids have a TikTok account anyway (once they're adults they can have one of course if they want to).

edarchis

> But critics have said E2EE makes it harder to stop harmful content spreading online, because it means tech firms and law enforcement have no way of viewing any material sent in direct messages.

Like they give a damn. I report accounts that explicitly sell fake credit cards, citing the laws that make it illegal, and 95% of the time the answer amounts to "we checked and there is no violation here; we know you're not happy, but we don't give a crap."

So the security argument is utter bullshit; they just want to snoop.

bas

Fascinating. What a time to be alive.

hd4

I hate the BBC so much - "controversial privacy tech" "E2EE ... the best way to protect conversations from .. even repressive authorities" "End-to-end encryption has been criticised by governments, police forces"

They're saying this at the same time as they're clutching pearls over Iran's repression of protestors. Typical of the ethical consistency I would expect from them.

tw04

Reminder: Larry "citizens shouldn't get any privacy" Ellison now owns TikTok. If you're still using it, or have friends and family using it, you should stop immediately. It WILL eventually be used against you if this regime gets its way.

https://digitaldemocracynow.org/2025/03/22/the-troubling-imp...

iso1631

The actual headline is currently:

> TikTok won't protect DMs with controversial privacy tech, saying it would put users at risk

Not sure if this was changed since first posting. I don't mind updates, but unless it's redacting for legal purposes (which should then itself be clearly mentioned), the BBC should provide a public changelog like Wikipedia.

crest

A Chinese company saying you don't need encryption. Why should anyone waste time debunking their bad faith "arguments"?

camillomiller

Doublespeak. War is peace.

Tyrubias

TikTok’s stance against end-to-end encryption is unsurprising but still concerning. TikTok is a source of information on many topics, such as the genocide in Gaza, which traditional media underreport and many governments try to suppress. The network effect of big social media platforms means many people will likely talk about these topics in TikTok DMs. No matter what legal controls TikTok claims to enforce, there is no substitute for technological barriers for preventing invasions of privacy and government overreach. This is yet another example where corporations and governments sacrifice people’s autonomy and privacy in the name of security.

burnt-resistor

It's the Max app for Americans, now with 900% more US and IL government spying.

blueTiger33

So we need no encryption? At the end of the day we have nothing to hide, right, CIA/FBI? :D

rdiddly

"The situation is made more complex because TikTok has long faced accusations that ties to the Chinese state may put users' data at risk."

And yet, it's even more complex than that, since it's now owned by cronies of the current US President. I've never had a TikTok account, but conceptually I was mostly pretty okay with being spied-upon by China. I'm never going to China.

quotemstr

It's one thing to make a policy decision I disagree with. It's another to lie, blatantly, to my face about it. But what do you expect from people who bought TikTok specifically so they could add censorship and lied about it being some kind of national security issue?

knodi

Another step towards the endgame: a mass surveillance state.

Madmallard

clown emoji

villgax

This, according to many researchers, is the best case study of corporations gaslighting users into accepting surveillance by companies and governments alike.

croes

> Grooming and harassment risks are very real in DMs [direct messages] so TikTok now can credibly argue that it's prioritising 'proactive safety' over 'privacy absolutism' which is a pretty powerful soundbite

Which means they read every message.

animitronix

lol why are people still using this garbage?

deafpolygon

Why are we still wringing our hands over this? We've already determined that TikTok is bad for our health.

Because TikTok is addictive, and they know it…

Bud

BBC calling encryption "controversial privacy tech" is deeply disappointing and dangerous.

bsza

> We know just how risky end-to-end-encrypted platforms can be for children

As opposed to doomscrolling and brainrot, which are not risky to expose children to at all. /s

If TikTok cared about children in the slightest, they would not exist.