I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app", as long as there are relatively good options for apps that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much idea of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it you can move to Snapchat or Signal or whatever platform of your choice.
Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.
In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy' I do think it's on you to also then put additional resources into tackling the downsides of that.
Tiktok has private messaging, and it is used by hundreds of millions of people.
IMO no consumer service should have private 1:1 messaging without e2e. Either only do public messaging (i.e., like a forum), or implement e2e.
Tiktok has direct messages, they don't even call them private.
It's better that they're honest about this, nobody should believe for a second that WhatsApp or FB messages are truly E2EE.
DM on social media shouldn't be used for anything remotely private. It's a convenience feature, nothing more.
> nobody should believe for a second that WhatsApp or FB messages are truly E2EE.
Meta still tracks analytics which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though
> Meta still tracks analytics which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though
Correct. WhatsApp uses the Signal protocol, and there is zero evidence of them reading message contents except with the consent of one of the users involved (such as a user reporting a message for moderation purposes).
(And before anyone takes issue with that last qualifier, consent from at least one party is the bar for secure communications on any platform, Signal included. If you don't trust the person you are communicating with, no amount of encryption will protect you).
Discovering a backdoor in WhatsApp for Facebook/Meta to read messages would be a career-defining finding for a security researcher, so it's not like this is some topic nobody has ever thought to investigate.
>I'm not aware of any news of them
Yet. Until they say "we delete these messages after X time and they are gone, gone, and we're not reading them," assume they are reading them, or will read them and the information just hasn't gotten out yet.
I mean, we keep finding more and more cases where companies like FB and Google were reading messages years ago, and it wasn't until now that we found out.
> We delete these messages after X time
They never had the plaintext of the messages in the first place, so they don't need to delete them. That's what end-to-end encrypted means.
Way to dunk on OP, I guess, but nobody is playing semantics here; it's just a question of whether people think this is a messaging channel with one intended recipient.
> Tiktok has direct messages, they don't even call them private.
It may not be called that, but what are users expecting? Some folks may later be surprised when a warrant gets issued (e.g., from a divorce judge).
If you are a grown adult and don't do research on "messaging apps" (which TikTok is not), then that's really on you.
This viewpoint isn't a slippery slope, it's a runaway train.
"You moved into a neighborhood with lead pipes? That's on you, should have done more research."

"Your vitamins contained undisclosed allergens? You're an adult, and it didn't say it DIDN'T contain those."

"Passwords stolen because your provider stored them in plaintext? They never claimed to store them securely, so it's really on you."
Legislating that everyone must always be safe regardless of what app they use is a one-way ticket to walled gardens for everything. This kind of safety is the rationale behind things like secure boot, Apple's App Store, and remote attestation.
Also consider what this means for open source. No hobbyist can ship an IM app if they don't go all the way and E2E encrypt (and security audit) the damn thing. The barriers to entry this creates are huge and very beneficial for the already powerful, since they can afford to deal with this stuff from day one.
This isn't anything new, however. No messaging has ever been actually private; that's why encryption was invented: to keep secrets and to pass them in a way that can be observed without revealing them.
Telephones can be tapped, so people sold special boxes that would encrypt/decrypt the audio before passing it to the phone or to the ear. Mail can be opened, covertly or not. AIM was in the clear (I think at one point fully in the clear, later probably in the clear as far as the AOL servers were concerned)...
Unless the app/method is directly lying to users about being e2ee, it's not a slippery slope; it's the status quo. Now, there are some apps out there that I think I've seen that are lying. They claim they are 'encrypted' but fail to clarify that it's only private on the wire, like the AIM story: the message is encrypted while it flies to the 'switchboard', where it's plaintext, and then it's wrapped in encryption again on the wire to send it to the recipient.
The claim here that actually makes me chuckle is somehow trying to paint e2ee as 'unsafe' for users.
If you are a grown adult and don't do research on "<insert any topic that could have a material negative impact on your life, but that is not currently on your radar as being a topic that could have a material negative impact on your life>" then that's really on you.
Unfortunately, this doesn't scale.
Well, it does scale… just not in a way that is good for democracy.
80% of the population does not and will never do that level of deep dive on apps
Same discussion for any form of technology, be it TVs or changing their car's oil.
The deliberate app-store-ification of all things computer is also designed to keep people from asking those questions -- just download it and install, pleb.
it's why the Zoomers can't email attachments or change file types: all of the computers they grew up with were designed so they never had to understand what happens under the hood.
And I think because of all the handholding we are left worse off.
Honestly I'm tired with every app trying to become the everything app.
Now TikTok wants to be a messaging app. Snapchat has a short video feed just like TikTok. WhatsApp only has a text feed, how long until they also add a video feed?
Meta already has video feeds in Facebook and Instagram though; I imagine they wouldn't want to draw users away from those.
> nobody should believe for a second that WhatsApp or FB messages are truly E2EE
That's interesting. You think all the firms that audited WhatsApp and the Signal protocol it uses, plus all the programmers who have worked there over the years and could spot a lie and leak it if it were true, are all crooks? A valid opinion, I guess, but I wouldn't call it "nobody should believe for a second."
(curious you didn't mention Telegram, it is actually marketed as secure and e2e and it has completely gimped "secret chats" that are off by default and used by like almost nobody.)
I forget if it's WhatsApp that technically lets you sync chats in unencrypted form to iCloud, which is the "loophole" around this. You can lock down your iCloud even tighter; I'm not sure Apple can do much if you fully lock it down, and I'm not sure this has been legally tested. It's not a very advertised feature, just a setting.
iCloud backups are encrypted, and can be end-to-end encrypted.
Also, backups have nothing to do with the messages being end-to-end encrypted. Like if you don't use a passcode on the phone, the messages are still encrypted.
WhatsApp iPhone syncs to iCloud unencrypted by default[1].
iMessage also syncs to iCloud unencrypted by default[2].
[1] Depends on you paying for iCloud storage, so that you have space for a full phone backup to occur.
[2] Might be "free" with "iMessage in iCloud", an option to enable separately.
> WhatsApp iPhone syncs to iCloud unencrypted by default[1].
Not true. You must choose whether to enable it when you set up a new phone. On mine it does not back up.
If you must "choose to enable" encryption, that implies it's off by default. If so, GP's statement is accurate.
Choose to enable backups.
No, I mean you must select yes or no. can't use WhatsApp until you make a choice yourself.
The Android version syncs all your chat logs to Google Drive without encryption by default. That's the backdoor.
Right now there's a switch to enable e2e for backups, but yeah, I think the default backup is probably a workaround...
I'll believe it when it's FOSS
You mean you will read all the code, including dependencies, and compile it yourself to make sure? ;) Good for you. But good luck creating a popular e2e messenger then.
In my experience most forums have private messaging.
Additionally I think it is fine to say "we don't support e2ee". I prefer honesty to a bad (leaky) e2ee implementation, at least the user can make an informed choice.
I agree. At least the take of "yes, messages are stored on our servers" is honest. And whether they are accessed by anything other than a limited subpoena is a policy or legal issue.
>In my experience most forums have private messaging.
Yeah, but it's kind of accepted that the forum owner could read it all if they so chose. Maybe this is a holdover from the old days, when encryption was nowhere near default, during which forums arose.
Adding that private self-hosted forums can permit uploads of encrypted files, encrypted with a pre-shared secret or a secret shared over a private self-hosted Mumble voice chat server.
And yet virtually all consumer services with 1:1 messaging lack e2e. This is a bit of a quixotic position to take.
The email protocols would like to have a chat with you.
You can bring your own encryption to that, and bring your own client to automate it.
You can encrypt the content but not the metadata, not even the subject, unless you use a customized client that encodes it (like Delta Chat, which doesn't use a subject at all), but then you still have your email address exposed.
For all intents and purposes, email is not e2ee.
Email encryption for most people is sufficient even if the metadata is exposed. One can simply state in their email "Bing Bing Bong" or "Why did you not put the trash out?", which might mean to the recipient "check the second SFTP server", "let the cat outside", "jump on my private Mumble chat server", or "get on my private self-hosted IRC server". The email message need not even be encrypted, for that matter.
The intended payload can be in a header-less encrypted file on a throwaway SFTP server in a tmpfs RAM disk.
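The bring-your-own-encryption approach described above can be as simple as symmetric encryption under a pre-shared secret. A minimal sketch using Python's `cryptography` package (Fernet is just one illustrative choice; the key exchange happens out of band, as the comment says):

```python
from cryptography.fernet import Fernet

# Pre-shared secret, exchanged out of band (in person, over another channel).
key = Fernet.generate_key()

# The real payload becomes an opaque blob that can be mailed as an
# attachment or dropped on a throwaway SFTP server.
token = Fernet(key).encrypt(b"check the second SFTP server")

# Only someone holding the pre-shared key can recover it.
assert Fernet(key).decrypt(token) == b"check the second SFTP server"
```

The provider carrying `token` sees only ciphertext, though (as noted above) the metadata of who sent what to whom is still exposed.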
So it's end to end encrypted except that third parties can see who you communicated with and when? Sure.
Exactly.
I have never considered metadata a part of the term E2EE. It has always been about the message contents.
I understand that metadata is valuable information for spies/governments and that encrypting or hiding it is valuable for privacy. But if you use that definition, there are almost no E2EE protocols on the planet in use.
First and foremost, any protocol that uses Apple or Google push notifications is giving metadata to those organizations. Even WhatsApp, iMessage, Signal, and Telegram private messages all leak metadata, but the contents of messages are hidden from the provider.
yeah bro genius, that sounds like a totally actionable thing people will do all the time with email. Be sure to drink your ovaltine
yeah bro genius
I know, right? I admit that is mostly for people on Linux desktops. People on smart phones are 100% monitored regardless of encryption or fake E2EE that platforms pinky promise is really E2EE like Signal. Shame on Moxie, he knows better.
Ovaltine has a crapload of sugar. Don't drink that horse piss.
I can bring my own encryption to tiktok as well. Has roughly the same usability and usage.
You can bring your own encryption to ANY messaging platform; it just won't be easy to use. e2ee really makes it handy so that users don't need to preshare any keys.
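The "no preshared keys" convenience is roughly what Diffie-Hellman key agreement gives you: each side publishes only a public key, and both derive the same secret. A minimal sketch with X25519 from Python's `cryptography` package (the library and curve choice are my assumptions for illustration, not any specific platform's protocol):

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Each side generates its own key pair; only the PUBLIC halves are
# exchanged, so nothing secret has to be preshared out of band.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key.
alice_shared = alice.exchange(bob.public_key())
bob_shared = bob.exchange(alice.public_key())

# Both arrive at the same 32-byte secret, which can then key the
# actual message encryption.
assert alice_shared == bob_shared
```

Real protocols (e.g. the Signal protocol) layer ratcheting and authentication on top of this, but the key-agreement step is why users never have to swap keys by hand.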
> as long as there are relatively good options of apps that do have privacy (and I think there are)
Once you have enormous network effect like TikTok has, you don't really have any free selection of alternative apps. You are free to use one, but you will be the only sad user over there.
Regulations are needed that would force large platforms like TikTok and Instagram to enable federation, opening them up to actual competition. This way platforms would be able to compete on monetisation and usability, instead of competing on locking in their precious users more strictly.
“Will we ever end the MySpace monopoly?”
> MySpace is well on the way to becoming what economists call a "natural monopoly". Users have invested so much social capital in putting up data about themselves it is not worth their changing sites, especially since every new user that MySpace attracts adds to its value as a network of interacting people.
> "In social networking, there is a huge advantage to have scale. You can find almost anyone on MySpace and the more time that has been invested in the site, the more locked in people are".
https://www.theguardian.com/technology/2007/feb/08/business....
>Regulations are needed
Lolololol. No, not regulations. Regulators. With the people we currently have voted into office in the US the only regulations we are going to get are ones saying Sam and Peter must look at everything you do all the time.
Until we stop voting for more authoritarianism, expect ever increasing amounts of authoritarianism.
Federation would never work. How would it work here? Either you force TikTok to give pageviews to federated spam, or you let TikTok decide which federated servers to work with, which essentially results in no federation.
I am fine with TikTok remaining one of those 'we watch what you are doing' platforms. Those who do not care can have that if they wish; I do not mind.
But bullshitting that it makes users more safe, that is... bullshit! Worse than that, it distorts public opinion, intentionally fooling the gullible.
That it’s fine because it’s the CCP (commies see all) is a new one.
It’s at best subpar for the same reasons as if it was the usual Silicon Valley spyware.
I could leave well enough alone. But why? Because there are choices? There are five other brands of cereal that do not have 25% sugar? I’d rather be a negative nancy towards these on-purpose addictive, privacy-leaking attention pimp apps.
It might be fine if they presented an honest choice.
They are lying straight off, though... police and the safety team don't read messages only "if they needed to" to keep people safe. They do so for a large variety of other reasons, such as suppressing political dissent and asserting domination and control.
I don't think we can expect most people to understand TikTok's BS here either. I notice even a skeptic like you is uncritically echoing the dubious conflation of privacy and CSAM.
Anyone who doubts the requirement for e2e messaging should not be considered a skeptic, they are fully buying into whatever narrative LEO would like you to believe.
Fine with me too. I think many other apps (WhatsApp, FB, etc.) are using E2EE for PR purposes and are not actually good implementations of E2EE.
Good implementations of E2EE:
1. Generate the key pairs on device, and the private key is never seen by the server nor accessible via any server push triggered code.
2. If an encrypted form of the private key is sent to the server for convenience, it needs to be encrypted with a password with enough bits of entropy to prevent people who have access to the server from being able to brute force decode it.
3. Have an open-source implementation of the client app facilitating verifiability of (1) and (2)
4. Permit the users to self-compile and use the open-source implementation
If a company isn't willing to do this, I'd rather they not call it E2EE and dupe the public into thinking they're safe from bad actors.
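Points (1) and (2) can be sketched concretely. Below is a minimal illustration using Python's `cryptography` package; the curve, the KDF parameters, and the placeholder password are all illustrative assumptions, not any particular app's scheme:

```python
import os
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# (1) Generate the key pair on device; the raw private key never leaves it.
private_key = X25519PrivateKey.generate()
public_key_bytes = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)

# (2) If the private key is backed up to the server for convenience, wrap it
# with a key derived from a user password. The password must have enough
# entropy that someone with server access cannot brute-force the wrapping key.
password = b"correct horse battery staple"  # placeholder; needs high entropy
salt = os.urandom(16)
wrapping_key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

private_bytes = private_key.private_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PrivateFormat.Raw,
    encryption_algorithm=serialization.NoEncryption(),
)
nonce = os.urandom(12)
wrapped = AESGCM(wrapping_key).encrypt(nonce, private_bytes, None)

# Only (salt, nonce, wrapped) and the public key go to the server;
# the server never sees the plaintext private key.
```

Points (3) and (4), the open-source client and self-compilation, are what let outsiders check that the shipped binary actually does the above.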
No, saying that e2e encryption makes users _less_ safe is completely dishonest, nothing is fine about this.
The logic of "anything is better than before" is also fallacious.
Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation).
If it's E2EE, no one except the sender and receiver know about this conversation. You want an MITM in this case to detect/block such things or at least keep record of what's going on for a subpoena.
I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.
The receiver has a proven and signed bundle that they can attach to an abuse report, so the evidence has even stronger weight. They can already decrypt the message; they can still report it.
Yes, but this leaves reporting by the minor as the only way to identify this behavior. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted.
I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.
> I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted
Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People who are wary of this sort of thing don't believe law enforcement is somehow more effective when it is constrained; it's that how easily crimes can be prosecuted is only one dimension of safety.
> However, an alternative could be allowing the sharing of the encryption key with a parent
Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?
> Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?
Police can access your home with a warrant.
Police cannot access your E2EE DMs with a warrant.
Not answering my question!
> Police cannot access your E2EE DMs with a warrant.
They can and do, regularly. What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught. But refusing to give up encryption keys and suspiciously empty chat histories in the face of a valid warrant are very good evidence of a crime in themselves.
They also can't prevent you from flushing drugs down the toilet, but somehow people are still convicted for drug-related crimes all the time. So - yes, obviously, the police could prosecute more crimes if we gave up this protection. That's how limitations on police power work.
> But refusing to give up encryption keys and suspiciously empty chat histories with a valid warrant is very good evidence of a crime in itself.
Uh, it absolutely isn't? WTF dystopian idea is this?
And they shouldn't be able to. Police accessing DMs is more like "listening to every conversation you ever had in your house (and outside)" than "entering your house".
>Police cannot access your E2EE DMs with a warrant.
Well, they kind of can, if they nab your cell phone or another device that has a valid access token.
I think it's kind of analogous to the police getting at one's safe. You might have removed the contents before they got there but that's your prerogative.
I think this results in acceptable tradeoffs.
Yes, that is a fair argument and most countries allow the use of surveillance cameras in public for this reason.
SimpleX handles this by sending the decryption keys when the receiver reports the message.
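The report-time key disclosure described here can be sketched as per-message keys that are revealed only when the receiver files a report. This is my simplification for illustration, not SimpleX's actual protocol:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Sender encrypts each message under its own random per-message key.
msg_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(msg_key).encrypt(nonce, b"abusive message", None)

# The server normally only ever sees (nonce, ciphertext). When the
# receiver files an abuse report, they disclose msg_key for that one
# message, so moderators can decrypt it -- and nothing else.
report = {"nonce": nonce, "ciphertext": ciphertext, "key": msg_key}
disclosed = AESGCM(report["key"]).decrypt(
    report["nonce"], report["ciphertext"], None
)
```

Because each message has its own key, disclosing one key for moderation doesn't compromise the rest of the conversation.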
Keeping children safe and prosecuting are two different concepts, only vaguely related. So no, being able to track pdfs doesn't make children safer. What keeps them safe is teaching them safe communication habits and keeping them away from things like TikTok.
We shouldn't make the world a worse place for every one because some parents can't take care of their children.
>Keeping children safe and prosecuting are too different concepts, only vaguely related.
See also: That time the FBI took over a CSAM site and kept it running so they could nab a bunch of users.
Not necessarily saying what they did was right, but I think there's a strong utilitarian argument to be made that what they did in that case was, in fact, the best way to keep children safe.
What's more dangerous? CSAM on the internet? Or actual child predators running loose?
That stuff spreads and re-spreads just like anything else people download off the internet. There's a pretty strong argument for shutting it down right away. IIRC most users were outside jurisdiction.
Even if one more person was prosecuted it was worth it. If you shut down an illegal website a new one will show up a month later, with the same people involved, and you achieved nothing.
What was the rate of child exploitation in the GDR?
Ugh. The kids aren't even safe from the people making, and enforcing laws. This argument should be long over for anyone with eyes or ears.
Imagine Hamas are your government and want to figure out who's gay. You don't want a MITM in case they can do this.
Pick your definition of safe.
In that case, don't use TikTok DMs to discuss your sexuality. I think it is strange that people feel like they have to be able to talk about sensitive topics over every interface they can get their hands on.
Similarly, in "traditional" media you may not want to have such a private conversation on a radio broadcast. Perhaps you would rather discuss it on the phone or over snail mail, as there is more of an expectation of privacy on those mediums.
Right, but it currently isn't a sensitive topic - homosexuality is, as of 2026, broadly legal in the United States. That's a relatively new state of affairs, historically speaking, and one which Afghanistan shared as recently as 2021.
I'm commenting in the context of the conversation, not in a vacuum. You could just as (in fact, much more) easily say that children shouldn't be on apps with private messaging enabled. That would help a lot more, and then we could keep e2ee.
> there is more of an expectation of privacy on those medium
What does the "p" in "pm" stand for?
excuse me, I confused "Private messages" (pm) for "Direct messages" (dm).
I will update above
I don't think you confused anything, except for the terminology the platform uses. There is an obvious expectation of privacy when sending direct messages!
Hasn't been true ANYTIME IN HISTORY. Hell it was well understood even by children that no conversation you had on the telephone was truly private. That's why cyphers were invented.
it stands for "not a public timeline post"
It should be obvious from how contrived your wording is that nobody thinks of them this way.
This is fine if you have TLS encryption and the platform is not local.
Sure, they can fabricate some evidence and get access to your messages, in which case, valid point.
It makes certain users less safe in certain situations.
E2E makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political, not technical, decision, but if we claim that there are any absolutes here, we just make sure that we'll never be taken seriously by anybody who matters.
Claiming e2e makes children less safe is flat out dishonest. And the irony of you criticising “absolutes” after trying to pass one is just delicious.
What are children at risk of, when E2EE is used?
What are children at risk of, when E2EE is not used?
> What are children at risk of, when E2EE is used?
Potential exposure to abusive adults.
> What are children at risk of, when E2EE is not used?
State-sanctioned violence.
This is the argument they can’t have…
Well, having no e2e encryption is safer than having a half-baked e2e encryption that has a backdoor and can be decrypted by the provider.
As for TikTok's stance, I think they just don't want to get involved with the Chinese government over encryption (and give a false sense of privacy to users).
Trying to gaslight the public into thinking end to end encryption makes users less safe is not fine.
>I think it's fine to say "You don't really have privacy on this app"
>Disagree. To analogize why: privacy isn't heated seats, *it's seat belts*. Comfort features and preferences are fine to tailor to your customers and your business model. Jaguar targets a different market than Ford, and that's just fine.
Safety features should be non-negotiable for all. Both Jaguar and Ford drivers merit the utmost protection against injury in crashes. Likewise, all applications that offer user messaging functionality should offer non-defective, non-harmful versions of it. To do that, e2e privacy is absolutely necessary.
>I just don't see the point in expecting some sort of principled stance out of them.
This is the defeatism that adds momentum to a downhill trajectory. Exactly the opposite approach arrests the slide - users expecting their applications and providers to behave in principled ways, and punishing those who do not, are what keeps principles alive. Failing to expect lawful and upright behavior out of those you depend on, be they political leaders or software solutions providers, guarantees that tomorrow's behavior will be less lawful and upright than yesterday's. Stop writing these people a pass for this horrible behavior, and start holding them unreasonably accountable for it, then we'll see behavior start to change in the direction that we mostly all agree that it needs to.
The most effective protests against internet censorship came from massive grass roots movements, with users drawing a line in the sand that they will not tolerate further impositions on their freedom.
>In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform.
The irony is so manifest of billions of people having their privacy stripped by politicians and business elites in the name of protecting our children, while those politicians and business elites conspire en masse to prey on and sex traffick our children. If these forces actually took those concerns seriously, rather than sensing them as an opportunity to push ulterior motives, they'd be eating each other alive, right now. Half of DC, half of Hollywood, and at least a tenth of most major college administrations would ALL be at the docket.
Tesla doesn't have parking sensors. They're a safety feature. There's lots of safety features in cars that are optional, we've got an entire rating system for the safety of cars.
We're talking about an app that's controlled by the CCP, I do expect them to take a principled stance - stances like Taiwan is a part of China and you can't be openly critical of the leader of the party. They don't have the same principles as you. You can force them to put in E2EE, but you can't force them to be honest about it or competent about it. I would rather know what we're getting than to push them to lie.
This is the same thing as the OpenAI/Anthropic thing. You've got Anthropic taking a principled stance and taking pain for it, and you've got OpenAI claiming to take the same stance but somehow agreeing to the terms of the DoW. Do you think it's more likely that Anthropic carelessly caused themselves massive trouble, or that OpenAI is claiming to have gotten concessions that clearly won't work in practice? I think it's naive to think the former.
>We're talking about an app that's controlled by the CCP, I do expect them to take a principled stance
In the area of large scale internet service providers, who do you expect to take a principled stance, and why do you expect them to take it?
If the answer is, "nobody", then why keep singling out China? And if the answer isn't "nobody", then how do we apply the same pressures and principles to TikTok and other platforms that offer messaging?
This isn't some abstract concern. We know that WESTERN journalists, activists, and others have been murdered in acts of transnational repression that either began or were focused and abetted by communications surveillance aimed toward political dissidence. It seems incredibly naive to believe that current Western political and military leadership could ever be dissuaded from taking effective action (and such surveillance and repression campaigns certainly are effective) by moral qualms unsupported by strong checks and balances of accountability. In other words - this sort of repression most likely continues happening to journalists, activists, human rights lawyers, and other political dissidents, in our society, today. Enabled by the refusal of our service providers to protect us, their users.
It seems incredibly naive - civilization threateningly so - to write a pass to anyone, let alone Larry Ellison, for opting to deliberately expose "his" users to this risk. Nothing is OK about this dereliction of responsibility towards them.