
Try the mud puddle test: log into your account on a new device using the password recovery flow. Can you see your old messages?

If the answer is yes then law enforcement can too.

https://www.forbes.com/sites/anthonykosner/2012/08/05/how-se...




Note that the mud puddle test was originally described on Matt's very own blog: https://blog.cryptographyengineering.com/2012/04/05/icloud-w... :)


And it only works because a corporation would likely want to offer this to its users as a convenient feature. If they were actively trying to hide this, they could rig the test and keep access to themselves.


It is true that passing the mud puddle test does not guarantee robust end-to-end encryption (there can still be backdoors reserved for company/law enforcement). But failing it definitely guarantees that there is no robust end-to-end encryption.


> If the answer is yes then law enforcement can too.

Is it technically possible for them to see it: yes

Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested.

They probably should implement E2EE for everything. Then they will have a good excuse not to cooperate, because they simply don't have the data.


> Does Telegram let them see it: I don't think so.

This is exceptionally naive. Even if he was arrested for not sharing with the French, what about for other countries? Was he arrested for not ever sharing or not sharing enough? Even if he, personally, has never shared, that doesn’t say anything about his employees who have the same access to these systems.

Your data is not private with Telegram. You are trusting Telegram. It is a trust-based app, not a cryptographically secure app.

If you trust Telegram, that’s your choice, but just because a person says the right words in interviews doesn’t mean your data is safe.


You cannot be sure and yet Telegram often gets mentioned for being the only platform where states do not have easy access to user information or the ability to censor certain messages/content.

So from a broad perspective, they probably behave better than comparable services.

I think Telegram should not be trusted, but I also do not trust the alternatives, that readily share information with states. A special focus for me is that my own jurisdiction does not have access to my social media content. Other countries are secondary at first.


> Telegram often gets mentioned for being the only platform where states do not have easy access to user information or the ability to censor certain messages/content.

By who?

Simplex especially or even Signal are far better.


Or Ricochet / Ricochet Refresh: https://github.com/blueprint-freespeech/ricochet-refresh

Or Briar

Or maybe even Session?


Following the St. Petersburg attack, the Federal Security Service (FSB), in an event that may ring somewhat familiar to many in the United States and Europe, asked Telegram for encryption keys to decode the dead attacker’s messages. Telegram said it couldn’t give the keys over because it didn’t have them. In response, Russia’s internet and media regulator said the company wasn’t complying with legal requirements. The court-ordered ban on accessing Telegram from within Russia followed shortly thereafter. Telegram did, though, enact a privacy policy in August 2018 where it could hand over terror suspects’ user information (though not encryption keys to their messages) if given a court order.

...

... Pavel Durov, Telegram’s founder, called on Russian authorities on June 4 to lift the ban. He cited ongoing Telegram efforts to significantly improve the removal of extremist propaganda from the platform in ways that don’t violate privacy, such as setting a precedent of handing encryption keys to the FSB.

https://www.atlanticcouncil.org/blogs/new-atlanticist/whats-...


This doesn't make any sense. Either the author of the article is confused, lying, or is drawing conclusions from source material that is untrue.

In the US case, there was a phone where data was encrypted at rest. Though Apple was capable of creating and signing a firmware update that would have made it easier for the FBI to brute force the password, Apple refused to do so.

In the Russian case, the FSB must have already had access to the suspect's phone because if it did not then Telegram would not be in any position to help at all.

So, the FSB must have already had access. And therefore, by having access to the phone they also had complete access to the suspect's chats in plaintext, regardless of whether or not the suspect used Telegram's private chat. There would have been no keys to ask Telegram for copies of.

Alternatively, the FSB might have had access to some other user's chats with the suspect, and wanted Telegram to turn over the suspect's full data. Telegram is 100% able to do that if they want to.

As the specific part of the article you have quoted is definitely bullshit, I suspect the rest of it is bullshit too and that despite what Roskomnadzor states in public, the real fight with Durov was over censorship.


Telegram is the only messaging app that I know of which brought attention to the fact that your messages go through Google/Apple notification APIs, which seems like it would utterly defeat any privacy advantage offered by E2EE.


Why? I think Google suggests that you send the payload encrypted through the notification. Google then only knows which app to send the message to, they don't know from whom the message originates (only "a Telegram server") nor what the content is.

Also, you could just send a notification instructing the app to fetch a new message from your server.

From the docs:

Encryption for data messages

The Android Transport Layer (see FCM architecture) uses point-to-point encryption. Depending on your needs, you may decide to add end-to-end encryption to data messages. FCM does not provide an end-to-end solution. However, there are external solutions available such as Capillary or DTLS.

https://firebase.google.com/docs/cloud-messaging/concept-opt...
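For illustration, here is a minimal sketch of what "sending the payload encrypted through the notification" can look like: the app encrypts the chat text under a key only the two clients hold, and the FCM data message carries nothing but opaque base64 blobs. The cipher below (HMAC-SHA256 in counter mode) and all the names are illustrative stand-ins, not what Telegram or FCM actually uses; a real app would use an established scheme such as Capillary or the Signal protocol.

```python
import base64
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """HMAC-SHA256 run in counter mode as a toy stream cipher (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, plaintext: bytes) -> dict:
    """Encrypt-then-MAC; every value is a base64 string, opaque to FCM."""
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    b64 = lambda b: base64.b64encode(b).decode()
    return {"nonce": b64(nonce), "ct": b64(ct), "tag": b64(tag)}

def open_(key: bytes, env: dict) -> bytes:
    nonce, ct, tag = (base64.b64decode(env[k]) for k in ("nonce", "ct", "tag"))
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("tampered envelope")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))

# What the FCM "data message" actually carries: routing info plus ciphertext.
shared_key = os.urandom(32)  # in reality: negotiated by the app's own E2EE handshake
fcm_message = {
    "message": {
        "token": "<device-registration-token>",  # placeholder
        "data": seal(shared_key, b"meet at noon"),
    }
}
```

Google sees which app and which device the message is for, plus size and timing, but not the content.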


Assuming an adversarial relationship, what sort of metadata could Google capture simply knowing which app was sending the notifications and who was receiving them?


Schneier mentioned this late in 2023:

https://www.schneier.com/blog/archives/2023/12/spying-throug...

> Wyden’s letter cited a “tip” as the source of the information about the surveillance. His staff did not elaborate on the tip, but a source familiar with the matter confirmed that both foreign and U.S. government agencies have been asking Apple and Google for metadata related to push notifications to, for example, help tie anonymous users of messaging apps to specific Apple or Google accounts.


Aren’t notifications enqueued on the server side, implying sender info is inscrutable? I’m curious what mechanism you’d propose to gather any valuable metadata given a sufficient volume of encrypted notifications.


"A Telegram server used FCM to send a message of size X to the device owned by individual Y at this timestamp and this IP address".

Nothing else.


If the text appears on your screen, I'm pretty sure there are ways for Google to capture it. I don't need to know how Android's API works; knowing it probably just makes one blind to the big picture. You have to trust your OS/phone maker not to do a MITM.


Yes, but Google cannot be compelled to turn over data they don't actually have on their servers because the users encrypted it before it arrived with keys Google don't control.

Signal could modify the application so that a remote flag in the Play Store binaries triggers data exfiltration as well. But the key distinction is that the normal path of Signal gives them absolutely nothing they can tell anyone other than the bits they've put in the disclosure reports (namely, I believe: the date and time an account ID used Signal).


I think parent's point is, if data appears on screen, the OS in theory can capture it and send it to Google servers as screenshots or OCR'd text.


Yes, that likely is the GP's point, but it's not really relevant to the discussion going on in this thread. Certainly Google could "backdoor" its OS in that way, but they have little motivation to do so (and a lot to lose if they were to do so and were found out). Their recent move to make their location history / timeline product an on-device-only feature because they don't want to have to respond to law enforcement requests for user location data would seem to suggest they really would prefer to not have this sort of data.

At any rate, the discussion going on here is about how Durov has been arrested because Telegram refuses to respond to law enforcement requests, when they do have the ability to do so; and if they were to actually implement E2EE by default (and for group chats), Durov would likely not be in trouble, since Telegram would be unable to provide anything when requested.


> Their recent move to make their location history / timeline product an on-device-only feature because they don't want to have to respond to law enforcement requests for user location data would seem to suggest they really would prefer to not have this sort of data.

I suspect that isn’t the motivation. GDPR says that you have to give users choices about data stored like this (including right to be forgotten, how it’s processed and used and so on), and this becomes a technical, legal and commercial nightmare very quickly. The easier route is just to get rid of it if you can.

This saves Google money (it likely wasn’t that useful to sell to advertisers), makes legal compliance a lot easier and de-risks them from very large fines.

I suspect that the EU lawmakers didn’t think about second order effects like making it harder for law enforcement to access this data in scenarios like this.


Google (and Apple) has remote root over their message bus. This is reflected in the fact that they can remove spyware from people's phones remotely at any time.

Should they have to comply with law enforcement they have much more straightforward ways of doing so than capturing messages off screen.


The app can decrypt the notification before it's displayed.


I don't think the plaintext is required to be part of the API call


And yet Telegram doesn't let you have E2EE chats on a Linux desktop or phone. You must rely on Google/Apple.


Most Telegram clients, apart from the initial mobile apps, were actually open-source projects that the company chose to make "official" ones.

They just don't implement E2EE, since almost no one uses it on Telegram.


This claim is what really makes me skeptical of Telegram's privacy story. Their assertion is completely incorrect. (Source: have implemented end to end encrypted payload delivery over APNs / GCM.)

And if they are so off base on this, they must either be incompetent or liars. Neither of which builds trust.


I’m old enough to remember when Signal first implemented cross-device sync using a Chrome plugin.

I’d rather developers issue cautionary warnings than give a false sense of perfect security


> Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested

The UAE requires decryption keys as part of their Telco regulations.

If Telegram can operate in the UAE without VPN (and it can), then at the very least the UAE MoI has access.

They (and their shadow firms like G42 and G42's shadow firms) were always a major buyer for offensive capabilities at GITEX.

On that note, NEVER bring your personal phone to DEFCON/Blackhat or GITEX.

Edit: cannot reply below so answering here

Cybersecurity conferences.

DEFCON/Blackhat happen during the same week, so you have a lot of script kiddies who lack common sense trying to pwn random workloads. They almost always get caught (and charged - happens every year), but it's a headache.

GITEX is MENA and Asia's largest cybersecurity conference. You have intelligence agencies from most of the Middle East, Africa, Europe, and Asia attending, plus a lot of corporate espionage because of politically connected MSSPs as well as massive defense tenders.


Sorry, but as someone who's completely out of the loop with these things. What's DEFCON/Blackhat or GITEX about and why shouldn't you bring your personal phone?

I'm genuinely interested.


DEF CON and Black Hat are hacker/computer security conferences started by Jeff Moss (aka DT or Dark Tangent) in 1993 and held at the end of July or early August every year in Las Vegas. The reason you don't bring your phone is it might get hacked.


[dead]


Because the attendees are high-value targets who often have elevated permissions inside the firms or governments they work in, and that's worth even more.


On a separate note, Zerodium is dead now. They're in the middle of an active fire sale, but the Zero Day market's bottom fell out now that countries are increasingly moving exploit development in-house or to vendors that can do both zero day acquisition AND exploit deployment (which Zerodium cannot do as an American company).

Also, u/reissbaker's answer is correct.


Skiddies are renowned for their rational thoughts and actions.


For the lulz


Best reason of any!


AFAIK this current case has absolutely nothing to do with any form of chat features, it’s about telegram’s public channels that more or less work like reddit/twitter/any other news channels, except it refuses to censor content.


> They probably should implement E2EE for everything

He explained in his blog why he doesn't like E2EE:

https://telegra.ph/Why-Isnt-Telegram-End-to-End-Encrypted-by...

Why Isn’t Telegram End-to-End Encrypted by Default?

Pavel Durov August 15, 2017


I do not think it is a remarkable feat to be more secure than WhatsApp.


All the encryption stuff is, to a large degree, just a red herring. It’s not the technical access to the information that is the issue; it is that people can share and exchange information that the various regimes do not want shared. They want censorship, i.e., control of thought and speech, arresting the information flow.

They know what is being said and that’s what they want to arrest, that information can be sent and received. And by “they” I mean more than just the French. That was just coincidental and pragmatic.

The French state does not operate that quickly on its own, to get an arrest warrant five minutes after he landed and execute on it immediately. That has other fingerprints all over it in my view.


> Does Telegram let them see it: I don't think so.

I do think so: https://archive.is/M5zw4

Also, 'exile' https://istories.media/en/news/2024/08/27/pavel-durov-has-vi...


> They probably should implement E2EE for everything

Certainly not, because then Telegram would lose a lot of the functionality that makes it great. One thing that I really enjoy about Telegram is that I can have it open and synced across many independent devices. Telegram also has E2E as an option on some clients, which can't be synced.


You can sync messages across many independent devices despite e2ee.

Matrix has been doing that for years


Does Matrix encryption scale? Telegram rooms have a huge number of participants. Also, last time I looked into this, Matrix encryption was opt-in.


In Matrix all PM rooms are E2EE by default.

For public rooms however, it doesn't really make sense to enable E2EE.


Many people seem to think that Telegram tries to be a Signal or Matrix replacement. I don't think Telegram tries to be any of that. If anything, you can compare it to Discord, except much better.

To enable synced E2E conversations across many devices you also need to sync private keys, which is a security nightmare.
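For what it's worth, the Matrix/Signal-style answer to that nightmare is device linking: the new device generates a keypair, the user verifies its fingerprint out of band (e.g. via a QR code), and the old device wraps the account secret to that key. A toy sketch of the idea, under deliberately undersized Diffie-Hellman parameters (not secure, purely illustrative):

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration only -- far too small for real use.
P = 0xFFFFFFFFFFFFFFC5  # a 64-bit prime
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv: int, their_pub: int) -> bytes:
    """Both devices compute the same key from the DH shared secret."""
    return hashlib.sha256(pow(their_pub, my_priv, P).to_bytes(8, "big")).digest()

def fingerprint(pub: int) -> str:
    """Short digest the user compares out of band (QR scan / read aloud) --
    this verification step is what makes the sync 'verified' E2EE."""
    return hashlib.sha256(pub.to_bytes(8, "big")).hexdigest()[:8]

# 1. The new device generates a keypair and displays its fingerprint.
new_priv, new_pub = keypair()
# 2. The old device does the same; the user confirms both fingerprints
#    match out of band before anything is synced.
old_priv, old_pub = keypair()

# 3. The old device wraps the account's history key for the new device
#    (XOR under the DH-derived key, both 32 bytes -- toy wrapping).
history_key = secrets.token_bytes(32)
wrapped = bytes(a ^ b for a, b in zip(history_key, shared_key(old_priv, new_pub)))

# 4. The new device unwraps it with its own private key.
unwrapped = bytes(a ^ b for a, b in zip(wrapped, shared_key(new_priv, old_pub)))
```

The server only ever relays public keys and the wrapped blob, so it never learns the history key.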


Either sync private keys or the messages itself.

Why would it be a security nightmare? In contrast to not even supporting e2ee in the first place?


How would you securely sync priv keys? How would you securely sync plaintext messages?

Telegram supports E2EE, but it is device-centric for this reason.


> How would you securely sync plaintext messages?

Same as with private keys: Verified e2ee

> Telegram supports e2ee but it is device centric for this reason

At least this is what they told you


> Same as with private keys: Verified e2ee

Yikes


Anything meaningful to say?


About as meaningful as the suggestion to use "verified e2ee" to transmit private keys


Shoot


Even whatsapp does it now


Does it? Last time I used WhatsApp I could not use it on my desktop without scanning a QR code each time and keeping the phone nearby.


You need to scan the QR code only the first time using the desktop app.


Can you use the desktop app without the phone present? For example, if the phone is turned off.


I have heard you can, for about 2 weeks. Then the phone must at least become active again.


With Telegram I do not have to worry about losing my "primary" device in order to access my account. Telegram is more a social network than a chat app. Moreover, Telegram does not require me to feed it my whole contact list in order to communicate. WhatsApp has this limitation on purpose (there actually are workarounds, but they are "hacks" and not how Meta wants you to use the app). It is very suspicious: why can't I search for a WhatsApp contact by manually typing a number into WhatsApp? Instead I need to put it inside my contacts app and grant WhatsApp full access to it. LOL


There was recently a link making the rounds here that purportedly allowed you to send WhatsApp messages to phone numbers not in your address book.


Yes you can


Either Telegram will let them see it, or Telegram's CEO will go to jail. Telegram's CEO doesn't want to go to jail, so Telegram will let them see it.


They probably share it with Russian authorities. Just look now: Russia is allowing protests in favour of him (they only allow protests they support), and they arrested a French citizen on fake drug charges right after.


Will they let _US_ law enforcement see it? No. Will they let Russian? Of course.


Source?


Recent support: the Kremlin yesterday arranged big protests in Moscow demanding his release, and yesterday arrested the nephew of the French ambassador, claiming he was dealing drugs (claiming he carried a package of heroin marked with the label "for distribution in Russia", as if all drug dealers put their intentions in writing), clearly to try to trade him.


> kremlin yesterday arrested the nephew of the french ambassador claim he was dealing drugs

Source?


Jesus, it was the Onion-like parody website (Panorama)


"Telegram is working with Russian intelligence" is a theory that's been floated in quite a few places, included Wired: https://www.wired.com/story/the-kremlin-has-entered-the-chat...


Do you have some info about Durov being arrested for not letting law enforcement see encrypted messages? The public info says he was arrested for "...lack of moderation, ...[and] failing to take steps to curb criminal uses of Telegram."

I don't see anywhere saying he's been arrested for anything to do with encryption or cooperating with investigations.

eg https://www.bbc.co.uk/news/articles/ckg2kz9kn93o but pretty much all the sources I have read say the same


Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored on the cloud. This of course has security implications, but also allows you to have a big number of chats without wasting your device memory like WhatsApp does, or having to delete old conversations, and allows you to access your chats from any device. By the way you can also set a password to log in from another device (two factor authentication, also on WhatsApp now you have this option).

To me it's a good tradeoff, of course I wouldn't use Telegram for anything illegal or suspect.


> It's the only messaging app where messages are stored on the cloud.

Besides Slack and Discord and Teams and whatever the heck Google has these days and iMessage and...

I think you mean it's the only messaging app that purports to have a focus on security where messages are stored in the cloud, which is true, but also sus. There's a reason why none of the others are doing it that way, and Telegram isn't really claiming to have solved a technical hurdle that the E2E apps didn't, it's just claiming that you can trust them more than you can trust the major messaging apps.

Maybe you can and maybe you can't, the point is that you can't know that they're actually a safer choice than any of the other cloud providers.


Matrix also keeps your message on the server. Except you can run your own server. And the messages are end to end encrypted. And you can keep a proper backup of the keys.

Granted, it can be clunky at times, but the properties are there, and decentralised end-to-end encrypted messaging is quite an incredible thing. (Yes, Matrix nerds, it's not messaging per se, it's really state replication, I know :))


As you alluded to, Matrix has really horrible UX. Telegram is meant to be easy for the many to use: finding content in chats or even globally across public channels for example is intuitive and snappy because their server does the heavy lifting. That's a huge sell for many, myself included.


Well, UX aside, it disproves the claim that you can't have synced messages with E2EE.


My Matrix messages are, I presume, not encrypted, because every device I have prompts me to sign this device's keys with the keys of another device (which doesn't exist) and the option to reset the encryption keys and lose access to old messages doesn't work either (it just crashes Element).


You can enable it on a per chat basis.


All PM rooms are E2EE by default.


Depends on the client.


Ah right. Yeah I keep forgetting that there are other clients beside the ones by Element because none which I initially tried were as fully featured as the Element ones.


Doesn’t Matrix replicate all chat metadata to any linked federated servers?


>it's just claiming that you can trust them more than you can trust the major messaging apps.

All the cool kids on the block eliminated the need to trust the provider decades ago. PGP: 33 years ago, OTR: 20 years ago, Signal: 14 years ago.


You have to trust the provider with Signal; they are fiercely anti-third-party clients, control the network, and have released versions of the app whose code was not tracked in the public sources. In extreme cases we're aware of years-old code being in there (MobileCoin, for example).

Signal evangelism needs to halt; you mean the Whisper protocol.


I don't completely agree. I am perfectly fine with there being multiple options for various use cases. Signal has its place. So does Telegram for that matter. Even Whatsapp..

That said, what I would love to see ( and likely won't at this point ) is the world where pidgin could exist again, because everyone is using some form of sensible standards that could be used.. right now it is mostly proprietary secret mess of things.

And don't get me started on convincing anyone in group to moving from one ecosystem to another. Fuck, I just want email for chat that is not owned by one org.. Is it really so much to ask ( it is rhetorical, I know the hurdles are there and only some deal with human nature )?


Like someone once said, "Pidgin is a flock of zero-days flying in formation". It had serious issues with leaking messages to other applications via D-Bus; I know this because I used that feature in the earliest version of my work, TFC.

You always forgot to enable OTR even when it was right there in front of you. You couldn't use it cross-device, and its 1536-bit DH got outdated without fixes. There's stuff like lurch that offers OMEMO, but still, I really prefer not having to think about key management anymore. With Signal, things just work, and it's magical.


You have to trust the platform with the metadata, but the actual E2E encryption of the messages is something you can personally verify if you cared to.


You can’t know what’s running on your client. Reproducible builds aren’t reproducible, and open source wasn’t followed (there was code in the client that was not present in the repos).

So, yes, trust is needed.


No serious project wants to collaborate with a bunch of hobbyist projects who may or may not keep their code up-to-date. Years ago, the Matrix ecosystem was a prime example of even basic features like end-to-end encryption being in many cases missing.

Having a single client gives you insane boost to security agility over decentralized alternatives.

Feel free to strive towards functional decentralized ecosystem that feels as good to use, then switching will be a no-brainer.


But that's literally the entire point of this article. That is, in this day and age, when people talk about "secure messaging apps" they are usually implying end-to-end encryption, which Telegram most certainly is not for the vast majority of usages.


Also, iMessage is very secure...but then all your stuff is backed up on iCloud servers unless you specifically disable it. That includes all your iCloud encryption keys and plaintext messages.

Worse, iPhones immediately start backing up to iCloud when set up for a new user - the only way to keep your network passwords and all manner of other stuff from hitting iCloud servers is to set the phone up with no network connection or even a SIM card installed.

Did I mention there's no longer a SIM slot, so you can't even control that?

And that iPhones by default if they detect a 'weak' wifi network will switch to cellular, so you can't connect the phone to a sandboxed wifi network?

You shouldn't have to put your phone in a faraday cage to keep it from uploading plaintext versions of your private communications and network passwords.


Well summed up. It's crazy how efficient these things are at working together to strip users of any agency or control, across many different domains.


That is the correct default. Every day users are far more likely to accidentally lose their data than to run into government snooping.


If that is the correct default, then why is Telegram blamed for having non-E2E chats by default? Maybe they also care about users who can accidentally lose their conversations. When Apple does it, it is good, but when Telegram or TikTok does the same, it is bad and not secure.


Because Telegram and its users heavily insinuate it’s comparable to Signal rather than TikTok.

Right on their front page, in giant font, they declare “private” and “secure” when they’re neither. It’s Telegram’s own fault they receive this criticism repeatedly, and they strangely complain every time they’re publicly spanked and taken to task. They’re heavily insinuating (I call it lying) to their users and then crying over and over because they get called out.

If they don’t want to be called out, they should quit insinuating those things; it’s dangerous af. They know they’re lying, though, so obviously they won’t stop. But omg, I wish their users would run fast and run far. It’s like watching an abused person who keeps going back to their abusive partner: “oh, they mean well”… pffft, no, they really don’t.


Because Apple is not in the business of hosting public discussion forums.

There is no crime in implementing or not of different encryption schemes.


It might be the correct default, but it doesn't make it secure (makes it insecure actually).


> That includes all your iCloud encryption keys and plaintext messages.

Are these stored encrypted or in the clear? If the latter, please cite your source.


They are stored encrypted but whether Apple has the key depends on whether you've turned on "Advanced Data Protection" (aka "I don't expect Apple to bail me out when I lose access to all my devices"). The table in this support article details the treatment of various data categories under the two options:

https://support.apple.com/en-us/102651

The default for many categories is that your keys are in iCloud so Apple can recover them for you. With Advanced turned on, the keys are only on your personal devices. A few categories, like the keychain, are always only on your devices.

Specifically, see Note 3: "If you use both iCloud Backup and Messages in iCloud, your backup includes a copy of the Messages in iCloud encryption key to help you recover your data." Under normal protection, Apple has the key to your backups, but with Advanced they don't.


And even "advanced" protection is not advanced enough to protect your calendar and contact list from the government (under the silly excuse that Apple uses standard protocols for that data).


Apple devices are also always gossiping about which devices are where


Which is one of the best features. I wouldn’t mind having an option to disable it, but then you also don’t get the advantage of others’ phones finding your device.


Luckily, microwave ovens make easy Faraday cages.


15 seconds on low, then 120 seconds on high.

Oh, you meant... oh.


lol, every image you take on an iPhone is sent to an Apple server regardless of whether it is in iCloud or not.


iMessage only encrypts messages with 1280-bit RSA; why do you think it is very secure?


iCloud can be disabled by an MDM profile installed by Apple Configurator at setup.


Looks like an easy task, even your granny can do it.


It does require a few clicks and passcode entry, https://support.apple.com/en-us/102400


Why do you need a Mac and additional software for this? This is clearly made for corporate users and not for ordinary people.


Mac is not needed, https://news.ycombinator.com/item?id=41351559

After an MDM profile is created by someone technical, it can be emailed to the non-technical user and installed with a few clicks and passcode confirmation.


Can I enroll my personal iPhone in MDM myself? And if I can have MDM with just my personal phone, do I need to buy some kind of subscription for it from Apple? Or pay some third-party?

I thought MDM was only for enterprise businesses and schools and universities, but I may very well be mistaken about that.


MDM profiles are just XML files. They can be created with any text editor and distributed to the phone by email or web server. Apple provides the free "Apple Configurator" app in the MacOS app store. There are also websites and/or OSS tools to generate profiles, e.g. https://github.com/ProfileCreator/ProfileCreator.
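As a sketch of what such a profile can look like, the snippet below builds one with Python's stdlib `plistlib`. The restriction key names (`allowCloudBackup`, `allowCloudDocumentSync`) come from Apple's Restrictions payload; the identifiers here are hypothetical, and some restriction keys only take effect on supervised devices:

```python
import plistlib
import uuid

# Inner payload: Apple's Restrictions payload type. Key names follow Apple's
# configuration-profile reference; identifiers below are made up for the example.
restrictions = {
    "PayloadType": "com.apple.applicationaccess",
    "PayloadVersion": 1,
    "PayloadIdentifier": "example.restrictions",
    "PayloadUUID": str(uuid.uuid4()),
    "allowCloudBackup": False,        # block iCloud device backup
    "allowCloudDocumentSync": False,  # block iCloud Drive document sync
}

# Outer wrapper: the configuration profile itself.
profile = {
    "PayloadType": "Configuration",
    "PayloadVersion": 1,
    "PayloadDisplayName": "Disable iCloud backup",
    "PayloadIdentifier": "example.profile",
    "PayloadUUID": str(uuid.uuid4()),
    "PayloadContent": [restrictions],
}

# plistlib emits the XML that a .mobileconfig file contains; email it or host
# it, and the user installs it with a few taps plus passcode confirmation.
mobileconfig = plistlib.dumps(profile)
```

Signing the profile (so the device shows it as "Verified") is a separate step not shown here.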


Apple supplies a free application for managing MDM.

https://support.apple.com/guide/apple-configurator-mac/welco...


^^^ Highly recommend this. If you are technical enough, a family-managed Apple Configurator setup is more than enough protection in most situations and from most threat actors.

If your threat actor has the resources to break that, get a CC or a good lawyer on retainer, I guess.


This saved me one time when I was gifted an Apple TV without a remote.

No way to add a WiFi profile, thus no way to use an iPhone as a remote. No ethernet available either.

Configured a WiFi profile, uploaded to the Apple TV and could finalize the setup.

It’s quite a powerful tool for initial setup.


Many companies in the industry mislead users about encryption and just try to use it as a buzzword to attract customers. Take Apple, as example. Apple cloud backups are not E2E encrypted by default (like Telegram chats), and even if you opt into E2E encryption, contact list and calendar won't be E2E encrypted anyway [1].

Yet, Apple tries to create an image that iPhone is a "secure" device, but if you use iCloud, they can give your contact list to government any time they want.

Apple by default doesn't use E2E for cloud backups, and Telegram doesn't use E2E for chats by default. So Telegram has comparable level of security to that of the leaders of the industry.

[1] https://support.apple.com/en-us/102651


I think a high-definition photo taken on a recent phone takes up an awful lot more device memory than a "big number of chats".


Yeah, but Whatsapp chats tend to be full of those... and videos.


(On Android), if you don't care about the (old) WhatsApp media, just delete it from your phone. It's all just loose files in `/storage/android/data/com.whatsapp` (or thereabouts). The text content of the chats will remain available.


Whatsapp automatically resizes them (in standard settings)

But it still gets big.


This is such a misrepresentation. Telegram could at-will feed the cloud-2FA password to password hashing function like Argon2 to derive a client-side encryption key. Everything could be backed up to the cloud in encrypted state only you can access. Do they do that? No.

So it's not so much a trade-off as it is half-assed security design.
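For concreteness, here is a minimal Python sketch of the design the comment describes. It's an illustration under stated assumptions, not Telegram's actual code: stdlib `hashlib.scrypt` stands in for Argon2 (both are memory-hard KDFs), and a toy hash-counter XOR keystream stands in for a real AEAD cipher. The point is that the password never leaves the client, so the server only ever stores ciphertext it cannot open:

```python
import hashlib, os

def derive_backup_key(password: str, salt: bytes) -> bytes:
    # scrypt as a stdlib stand-in for Argon2: both are memory-hard,
    # so offline guessing against the stored ciphertext stays expensive
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR against a hash-counter keystream; illustration only, a real
    # design would use an AEAD such as AES-GCM or ChaCha20-Poly1305
    stream, ctr = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

salt = os.urandom(16)                        # stored server-side, not secret
key = derive_backup_key("hunter2", salt)     # derived client-side only
blob = toy_encrypt(key, b"my chat history")  # this is all the server sees

# Any new device with the same password re-derives the key and decrypts
key2 = derive_backup_key("hunter2", salt)
assert toy_encrypt(key2, blob) == b"my chat history"
```

Decryption is the same XOR operation, which is why `toy_encrypt` is applied twice in the last assertion.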


Telegram currently has very intuitive and snappy search, even in very active groups with years of content. That's because the heavy lifting is done by the server. Think that'd still be possible if there was no way for the server to process the data?


PCs and phones have been fast enough to have snappy search on text data for years now.

Is "grep" not snappy enough for you?


Grep is an inefficient search engine, because it needs to scan through the whole content (and Telegram uses search indexes). Also, grep cannot deal with word forms and inflections (you type "foot" and you also want to find "feet"). Inflections are not very important in English, but you need to deal with them in other languages where a word can have many forms.
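A toy sketch of both points: an inverted index answers a query without scanning every message, and normalizing word forms at index time makes "foot" match "feet". The irregular-plural map and the suffix stripping here are hand-rolled stand-ins for a real lemmatizer, not how Telegram actually does it:

```python
from collections import defaultdict

# Tiny irregular-plural map as a stand-in for a real lemmatizer
LEMMA = {"feet": "foot", "mice": "mouse", "geese": "goose"}

def normalize(word: str) -> str:
    w = word.lower().strip(".,!?")
    if w in LEMMA:
        return LEMMA[w]
    return w[:-1] if w.endswith("s") else w  # crude plural stripping

def build_index(messages):
    # Inverted index: normalized word -> set of message ids
    index = defaultdict(set)
    for msg_id, text in enumerate(messages):
        for word in text.split():
            index[normalize(word)].add(msg_id)
    return index

msgs = ["My foot hurts", "Both feet are cold", "The cat sat"]
idx = build_index(msgs)
assert idx[normalize("foot")] == {0, 1}  # "foot" query also finds "feet"
```

A server can only build this kind of index over message text it can read, which is the trade-off the parent comment is pointing at.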


I'm not trying to claim Telegram uses grep or whatever. My point is that even very active chats on Telegram generate a somewhat small amount of text data, and I don't believe searching through it requires a massive, complex search engine with a super-fast backend.

I basically participate in hundreds of chats and the message history doesn't take tens of GBs. And I also know that search in the history of such chats isn't so snappy on an older Android phone.


Not at all. Try searching 500/1000 sources (maximum number of conversations any free/premium user can be part of), each with potentially millions of messages, and providing the results in under a second.


AFAIK Telegram doesn't have any super-advanced search features, nor does it instantly return results for all these years. Also, if you search for less common terms it usually takes longer than a second.

And if you just run the client on a device without a lot of this history cached, search won't be anywhere near as fast as you expect. So I'm pretty sure there's no server-side magic there, but instead very good UX.

Also, I can tell for certain that with the right index, grepping tons of JSON can be very efficient on any modern device.


Can't say there aren't others who've used my search terms, but some things I search for are pretty unique. And I do a lot of searching. And search speed is consistently sub-second. There's no trickery seeing years-old results from VERY heavily used chats appear instantly. There's heavy optimization happening for search, and I'm very certain default e2ee would destroy that experience. Unless you can point to 1 e2ee service working at that scale with comparable UX. I won't even touch on the local aspect as the amount of storage needed to just store the data is waaaay beyond anything I'm even interested in owning, much less the compute to make it so accessible.


> Also I can tell for certain that with right index grepping tons of JSONs can be very effective on any modern devices.

But to run local search you need to download the conversations to the device first, which might require a lot of (expensive) traffic.


Yeah, try searching anything older than a year, the amazing snappy search grinds to halt. Meanwhile I'm storing years worth of stuff on Signal with no issues, and it searches ridiculously fast offline with no seconds long pause for buffering.


So interesting. I just did a search for mentions of someone I know in multiple Telegram groups and channels, and got all the results, going back 5 years, instantly. And these groups and channels have millions of messages. All media is also perpetually available (unless deliberately deleted), and take a couple seconds to load. I don't see any other platform having that kind of convenience.


Yeah Telegram search is not in a state where anyone should be proud of it.


Apple could also use E2E for their cloud backups by default, but they don't (and if you enable E2E, it doesn't apply to contact list and calendar backup anyway). Why do you demand more from Telegram than from Apple or Google?


I'll have you know they had maths PhDs design their security, sir. Eight of them!

Yeah, it's a bit of a joke.


Yeah, put a geometrician* to do the job of a cryptographer. This is what you get.

* I'm being serious, Nikolai Durov's PhD dissertation title was "New Approach to Arakelov Geometry"

https://bonndoc.ulb.uni-bonn.de/xmlui/handle/20.500.11811/31...

https://arxiv.org/pdf/0704.2030


Advanced math is actually more difficult (in my opinion) than programming languages.


Cryptography is nightmare magic math that cares about the color of the pencil you write it with.

It's not enough you know how to design a cipher that is actually secure, you need to know how to implement it so that the calculator you run it on consumes exactly the right amount of time, and in some cases power, per operation.

Then you need to know how to use the primitives together, their modes of operation, and then you get to business, designing protocols. And 10% of your code is calling the libraries that handle all that stuff above, 90% is key management.

There's a good amount of misuse-resistant libraries available, but Nikolai was too proud to look into how the experts do this, and he failed even at trivial stuff: He went with SHA-1 instead of SHA-256. He didn't implement proper fingerprints. His protocol wasn't IND-CCA secure. He went with weird AES-IGE instead of AES-GCM, which is best practice. He used weird nonces with FF-DH, instead of going with more robust stuff like x25519.
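To make the IND-CCA / AES-GCM point concrete: AES-IGE by itself provides no ciphertext integrity, while AES-GCM authenticates what it encrypts. A stdlib-only encrypt-then-MAC sketch (with a toy hash-counter keystream standing in for a real cipher) shows the property that matters, namely that tampered ciphertext is rejected before any decryption happens:

```python
import hashlib, hmac, os

def ks(key: bytes, n: int) -> bytes:
    # toy keystream; real code would use AES-GCM or ChaCha20-Poly1305
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def seal(enc_key: bytes, mac_key: bytes, pt: bytes) -> bytes:
    ct = bytes(a ^ b for a, b in zip(pt, ks(enc_key, len(pt))))
    tag = hmac.new(mac_key, ct, hashlib.sha256).digest()  # encrypt-then-MAC
    return ct + tag

def open_(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    # trailing underscore avoids shadowing the builtin open()
    ct, tag = blob[:-32], blob[-32:]
    want = hmac.new(mac_key, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, want):
        raise ValueError("ciphertext tampered with")  # reject before decrypting
    return bytes(a ^ b for a, b in zip(ct, ks(enc_key, len(ct))))

ek, mk = os.urandom(32), os.urandom(32)
blob = seal(ek, mk, b"hello")
assert open_(ek, mk, blob) == b"hello"
```

Flipping a single ciphertext bit makes `open_` raise instead of returning garbage plaintext, which is the integrity guarantee an unauthenticated mode like IGE doesn't give you on its own.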

One thing you learn early in academia, is that expertise is very narrow. I bet he knows a lot about geometry. Maybe even quite a bit about math in general. But it's clear he doesn't know enough to design cryptographic protocols. The cobbler should have stuck to his last.

EDIT, to add, the real work with cryptographic protocols starts with designing everyday things that seem easy on the paper, with cryptographic assurance. Take group management that the server isn't controlling.

For Telegram it's a few boolean flags for admin status and then it's down to writing the code that removes the user from the group and prevents them from fetching group's messages.

For Signal it's a 58 page whitepaper on the design of how that is done properly https://eprint.iacr.org/2019/1416.pdf

This is ultimately what separates the good from the bad, figuring out how to accomplish things with cryptography that first seem almost impossible to do.


Sure, but cryptography is its own subfield of advanced math (and also a bunch of more CS and UX based implementation challenges like avoiding side channels).


> It's the only messaging app where messages are stored on the cloud

Unreal. Please share how you came to this world view.


> Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored on the cloud.

Wrong, Matrix does it too, but fully e2ee.

> and allows you to access your chats from any device.

That's not exclusive to Telegram either, because it is possible with e2ee as well


> It's the only messaging app where messages are stored on the cloud.

Instagram. FB Messenger. Skype. LINE. KakaoTalk. Discord. Slack. Teams. iMessage.


Google talk/Hangouts/Google Chat/Duo/Allo/Meet/another Meet/etc. Counts as one


You never know what may suddenly become illegal.


>It's the only messaging app where messages are stored on the cloud.

So do all the others with the exception of something like IRC.


Not really. WhatsApp only keeps them temporarily (and E2EE!) until they're delivered to each device. Signal too. Telegram keeps everything for all time. Which is kinda handy too, I have to say.

Of course you can send your backup to Google for WhatsApp and Signal, but that's optional. You can keep it locally too. And it's encrypted too. With WhatsApp you can even choose to keep the key locally only.


WhatsApp? The closed source app that AFAIK has never been externally audited, owned by one of the most privacy-disrespecting corporations in the world? You say I can trust it wholeheartedly as long as I don't upload backups to the cloud?


The founder departing Meta on very bad terms is quite a signal to me:

https://www.forbes.com/sites/parmyolson/2018/09/26/exclusive...


I 100% trust they implement the signal protocol as they claim. I am also similarly sure that they ALSO have a sidechannel for everything.


That's it. The article could be just that. You log back in and all your messages are there without you having to provide a secret or allow access to some specific backup? Your data just lives on the server. The only thing preventing anyone from accessing it is the goodwill of the people running the server.


Not true. Secret chats only live on the device where you started them. Regular people may not use them (their problem), but these are common for business-critical chats in my circles.


Indeed and this is the other thing - even if Telegram don't themselves co-operate with law enforcement, it'd be fairly easy for law enforcement to request access to the phone number from the carrier, then use it to sign into the Telegram account in question and access all of the messages.


You can set a password that’s required to authenticate a new device.

Once that’s set, after the SMS code (and assuming you don’t have access to an existing logged-in device, because then you are already in…), you can either reset the password via an email confirmation _or_ create a new account under that phone number (with no existing history, contacts, etc).

If you set a password and no recovery email, there is no way for them to get access to your contacts or chat history barring getting them from Telegram themselves.


If you apply this test to things like LastPass or Bitwarden they fail too. And yet they don't keep my unencrypted passwords on their servers.


If you lose your Bitwarden master password you've lost your data. It passes the mud puddle test.


ah, you are right ... I missed the password recovery flow part which is a key thing here


I'm probably dumb, but why would that be proof?

I upload encrypted backups to a cloud service provider (AWS, Google Cloud). I go to another computer, download them, use a key/password to decrypt them.

Sure, I get it, you're typing in something that decrypts the data into their app. That's true of all apps including WhatsApp, etc... The only way this could really be secure is if you used a different app to the encryption that you wrote/audited such that the messaging app never has access to your password/private key. Otherwise, at some point, you're trusting their app to do what they claim.


> > using the password recovery flow

> use a key/password

The previous poster intentionally mentioned the password recovery flow. If you can gain access without your password, then law enforcement can too. If you could only gain access with your password, you could consider your data safe.


> If you could only gain access with your password, you could consider your data safe.

You can't assume the negation.

If you can get access without your password then you have proven that law enforcement or the hosting company can too.

If you can't get access then you haven't proven anything. They may be securely storing your data end-to-end encrypted. Or they may just have a very strict account recovery process but the data is still on their servers in the clear.


Offhand, this sounds like a terribly insecure workflow but...

Client creates a Public Private key pair used for E2EE.

Client uses the 'account password (raw)' as part of the creation of a symmetric encryption key, and uses that to encrypt and store the SECRET key on the service's cloud.

NewClient signs in, downloads the encrypted SECRETKeyBlob and decodes using the reconstructed symmetric key based on the sign in password. Old messages can then be decoded.
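A rough Python sketch of that flow, with scrypt as an assumed KDF and a toy XOR construction in place of a real key-wrap mode (neither is specified in the comment); the re-wrap on password change at the end is exactly the part flagged as insecure below:

```python
import hashlib, os

def kdf(password: str, salt: bytes) -> bytes:
    # password-derived wrapping key; scrypt chosen here for illustration
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

def xor_wrap(key: bytes, data: bytes) -> bytes:
    # toy symmetric wrap via hash-counter keystream; a real design
    # would use an AEAD or a dedicated key-wrap mode
    stream, ctr = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

secret = os.urandom(32)   # long-term E2EE SECRET key, generated on device
salt = os.urandom(16)
wrapped = xor_wrap(kdf("old-password", salt), secret)   # stored in the cloud

# NewClient signs in, downloads the blob, re-derives the key, unwraps
assert xor_wrap(kdf("old-password", salt), wrapped) == secret

# Password change: the SAME secret must be re-wrapped and re-uploaded
rewrapped = xor_wrap(kdf("new-password", salt), secret)
assert xor_wrap(kdf("new-password", salt), rewrapped) == secret
```

Unwrapping is the same XOR applied again, which is why the check re-runs `xor_wrap` on the stored blob.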

-- The part that's insecure. -- If the password ever changes the SAME SECRET then needs to be stored to the cloud again, encrypted by the new key. Some padding with random data might help with this but this still sounds like a huge security loophole.

-- Worse Insecurity -- A customer's device could be shipped a compromised client which uploads the SECRET keys to requesting third parties upon sign-in. Those third parties could be large corporations or governments.

I do not see how anyone expects to use a mobile device for any serious security domain. At best average consumers can have a reasonable hope that it's safe from crooks who care about the average citizen.


> When you regain consciousness you'll be perfectly fine, but won't for the life of you be able to recall your device passwords or keys

You can't use your password as input to the mud puddle test.


How would an end user even know they're running that test for a closed box system? The idea is what's possible in the real world.


I know this is getting off-topic, but all the discussion about encryption misses an important weakness in any crypto algorithm: the human factor.

I found it interesting that countries like Singapore haven’t introduced requirements for backdoors. They are notorious for passing laws for whatever they want as the current government has a super majority and court that tends to side with the government.

Add on top Telegram is used widely in illegal drug transactions in Singapore.

What’s the reason? They just attack the human factor.

They just get invites to Telegram groups, or they bust someone and force them to handover access to their Telegram account. Set up surveillance for the delivery and boom crypto drug ring is taken down. They’ve done it again and again.

One could imagine this same technique could be used for any Telegram group or conversation.


Would love to see a side-by-side comparison of iMessage, Signal, WhatsApp and Telegram on this.


You already know how Signal is going to come out here, because this is something people complain incessantly about (the inconvenience of not getting transcripts when enrolling new devices).


It's a bit unfortunate there isn't a mechanism to establish a key between your desktop and smartphone clients that would allow message history to be synced over an E2EE connection. It's doable, but perhaps it's an intentional safety feature that one can't export the messages too easily.


I agree with the principle here wholeheartedly. One addendum, though: I think this isn't quite the same as the mud puddle test. The idea behind the mud puddle test is that if you've forgotten everything but then manage to recover your data, then someone other than you must have had access. With Signal, they intentionally refuse to sync data as an extra security step even if you have the keys; the software just refuses to do the syncing step. I'm glad they do, personally, and I'm not contradicting your point, just adding some notes.

Edit: Actually, yeah, that proves your point.


This isn't fully accurate. You can back up your Signal messages on Android with an encrypted file and a key you control. So yes, just installing on a new device isn't going to give you history. I'd prefer they offered a universal structure for that backup file so we could easily switch between Android and iOS, and some way to back up your data at all on iOS (presently, if anything goes wrong when setting up a new phone, you lose your entire message history).



Matrix doesn't allow this. You need a dedicated chat key in addition.


Also the same with Skype "encryption". The data is "encrypted", but you receive the private key from the server upon sign-on... So one just needs to change that password temporarily.


How to do that on initial account creation:

- locally create a recovery key and use it to wrap any other essential keys

- Split that or wrap that with two or more keys.

- N - 1 goes to the cloud to be used as MFA tokens on recovery.

- For the other, derive keys from normalized responses to recovery questions, use Shamir's secret sharing to pick a number of required correct responses and encrypt the Nth key.

You can recover an account without knowing your original password or having your original device.
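A minimal sketch of the Shamir step in that design: split a recovery secret into 5 shares where any 3 (e.g. three correctly answered recovery questions) reconstruct it. This uses a simple prime-field implementation for illustration, not a production secret-sharing library:

```python
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def split(secret: int, n: int, k: int):
    # random polynomial of degree k-1 with the secret as constant term
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = random.randrange(P)
shares = split(secret, n=5, k=3)   # 5 questions, any 3 correct answers suffice
assert recover(shares[:3]) == secret
assert recover([shares[0], shares[2], shares[4]]) == secret
```

Fewer than k shares reveal nothing about the secret, which is what lets the provider demand several correct recovery answers without holding the secret itself.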


IOW, you've made the recovery questions into alternate passwords, passwords that law enforcement is likely able to find or brute force.


Telegram has an answer to this: https://telegram.org/faq#q-do-you-process-data-requests - only Secret Chats are e2e encrypted.

As an alternative, Signal or Jami conversations are always e2e encrypted.


Unless you can prove (e.g. using your old device or a recovered signing key) that the new device is yours. In that case, if the service supports it, the new device could automatically ask your contacts to re-send the old messages using the new device's public key.


Telegram has secure calls and secure e2e private chats. All other chats are backed up to the cloud. So if you intend to use private communication, the answer is "no"; if you don't care, the answer is "yes".


Unfortunately if the answer is no, it does not mean law enforcement can’t


Why not the "founder locked up" test? If the founder claims secure encryption, yet they are not in jail, that means there's no secure encryption because they negotiated their freedom in exchange for secret backdoors.


Maybe, but not a good litmus test. If it’s truly secure and the founder can’t provide information because they don’t have access to it, it’s also possible they can’t build a case in most countries.


In Russia too?


That isn’t applicable here. Telegram isn’t encrypted and yet they refused to comply with subpoenas. Companies whose customer data is encrypted can truthfully say that they have no way to access it for law enforcement. Telegram can’t.

Maybe in the future, creators of encrypted messaging apps will get locked up. I certainly hope not. But this case doesn’t indicate anything one way or another.


> Companies whose customer data is encrypted can truthfully say that they have no way to access it for law enforcement. Telegram can’t.

I dunno man, kinda seems like you ought to either have a right to privacy or not. Surely there's other ways to make a case, without extraordinarily abusable legal strong-arming.

Why should a wealthy person be able to legally afford encrypted communication on a secure device, when 90+% of people can't because they're poor and tech illiterate?

Does our historically unequal society need more information and rights asymmetry between rich and poor? Between privileged and marginalized?


Downloading Signal is just as easy as downloading Telegram.


As I said, tech illiterate - or as likely, legally illiterate.

It's unreasonable to expect most people to intuit the distinction you describe.

However, you don't see wealthy people communicating on insecure devices, because they have people to take care of that stuff.


I'm really not sure what you're referring to. You see lots of wealthy people communicate on insecure devices, and it's quite common for law enforcement to demand and obtain the contents of their communications. "Look at these terrible messages we subpoenaed" is a staple of white collar criminal prosecutions.


* White-collar crimes are estimated to make up only 3% of federal prosecutions.

* White-collar crime prosecutions decreased 53.5% from 2011 to 2021.

* Annual losses from white-collar crimes as of 2021 are anywhere from $426 billion to $1.7 trillion. The wide range here is due to the lack of prosecutions.

* There were 4,180 white-collar prosecutions in 2022.

* It’s estimated that up to 90% of white-collar crimes go unreported.

Etc.

- https://www.zippia.com/advice/white-collar-crime-statistics/

***

Responding by edit due to rate limit:

Guys the connection is clear if you think about it.

High-net-worth individuals use encrypted messaging apps more than the general population, without doubt.

They also have far more resources and abilities to fight a subpoena. It's all distinctively unfair and highly misleading to normal people; for very little real reason and with great potential for abuse.


Most prisoners in the US though are state prisoners (i.e., convicted by a state court) not federal prisoners (by a large margin I think). Lots of people are convicted in a state court for example of showing up at a bank branch with fake id and trying to cash a check. I gather that would be considered a white-collar crime?


I don't understand the connection between these statistics and your claim that wealthy people don't use insecure messaging apps.


You change the subject in each comment and it’s not clear how any of this relates to Telegram.


Yeah, and the only way to get government to learn why e2ee is important is to show them that if law enforcement can get it, then so can hackers/phishers. We need as many politicians' dark secrets hacked and outed as possible. It should be a whistleblower-protected right codified into law to perform such hacks



