Opmsg – A GPG Alternative (github.com/stealth)
121 points by wrench4916 on July 19, 2019 | 97 comments



I am not qualified to review how it implements forward security, for instance.

But this shares a lot of the problems that GPG has. It relies on existing mail standards, so it leaks metadata all over the place, and security can easily be defeated by "accidentally replying without encrypting." It's configurable -- every choice you have to make is a chance to make the wrong one. It implements RSA, which nobody should be using anymore.


Indeed, from Latacora's recent "The PGP Problem":

> Encrypting Email

> Don’t.

> Email is insecure. Even with PGP, it’s default-plaintext, which means that even if you do everything right, some totally reasonable person you mail, doing totally reasonable things, will invariably CC the quoted plaintext of your encrypted message to someone else (we don’t know a PGP email user who hasn’t seen this happen). PGP email is forward-insecure. Email metadata, including the subject (which is literally message content), are always plaintext.

> If you needed another reason, read the Efail paper. The GnuPG community, which mishandled the Efail disclosure, talks this research down a lot, but it was accepted at Usenix Security (one of the top academic software security venues) and at Black Hat USA (the top industry software security venue), was one of the best cryptographic attacks of the last 5 years, and is a pretty devastating indictment of the PGP ecosystem. As you’ll see from the paper, S/MIME isn’t better.

> This isn’t going to get fixed. To make actually-secure email, you’d have to tunnel another protocol over email (you’d still be conceding traffic analysis attacks). At that point, why bother pretending?

> Encrypting email is asking for a calamity. Recommending email encryption to at-risk users is malpractice. Anyone who tells you it’s secure to communicate over PGP-encrypted email is putting their weird preferences ahead of your safety.


If you know what you're doing, PGP can improve security. The real problem is that, the moment you're sending information to someone, you're giving that information away, out of your control.

If I understand it correctly, the above points mostly address the practical aspect: that someone who means well may too easily leak previously encrypted information out of ignorance.

A possible counter argument could be this: you already need to trust that a recipient won't leak the data you send them willfully and with harmful intent, so it's not that much more to ask that people also trust the competence of the recipient.

What Opmsg really seems to be about is cases where you don't trust the recipient to not betray you, intentionally or unintentionally.


No. Secure Messengers are designed to be hard to use unsafely. Nobody accidentally sends plaintext to a counterpart with Signal, because there's no feature in Signal that does that.


Strange example. Signal does support unencrypted SMS, which would make it easy for someone to forward a previously encrypted message as plain text.


If you can't trust the recipient, there's nothing you can do. The point isn't to protect yourself from an untrustworthy recipient, it's to protect yourself from inadvertent disclosure.


You can still send text messages (including to Signal contacts by long-pressing the send button). Obviously it is harder than just sending an encrypted message.


Email is really difficult-to-impossible to secure correctly.

Take metadata. Because of how email works, it's effectively impossible to hide the To, From, and Date (or more accurately, the Received) headers. If you're worried about three-letter agencies, that's the only metadata they need, so you're screwed before you started. Theoretically, S/MIME allows you to encrypt additional headers (including the venerable Subject header), and has done so for 15 years, but I'm unaware of any email client that actually supports this feature, and the downgrade mode is pretty UX-hostile.

Another very challenging problem is that the flow of email pretty much destroys any chance of using a good secure cryptosystem. The email sender is not necessarily able to establish a direct, synchronous contact with the email recipient, even on a server basis. That makes protocol negotiation and perfect forward secrecy difficult. Not to mention that users generally expect to be able to open up email clients on unknown machines (especially webmail clients), which means practical key distribution tends to amount to "give your provider your keys," at which point the security advantage over the current state of all-connections-are-wrapped-in-TLS is negligible.

There's also the point that email's main advantage as a messaging system is its universality. But any new protocol is going to suffer from being supported by only a small fraction of clients at first. If you can't get major email clients on board--and that includes webmail clients--then the universality of email is no longer an advantage in your proposed protocol. And if you're going to have to use a different client anyway, why bother dealing with all the crap of email syntax, MIME, SMTP, and IMAP?

About the only use case I can reasonably see for encrypting emails is in workflows like Bugzilla's "secure email" feature: the system already relies on email for communication, there is a clear way to handle the recipient's key and protocol negotiation, and the metadata isn't sensitive enough to need securing.


Doesn’t To and From metadata leak on encrypted-message apps, too? WhatsApp, Telegram and Signal use mobile phone numbers as identifiers. In many countries now, you cannot buy a SIM card unless you show the vendor proof of identity, a copy of which is then provided to the state for its records. Therefore, any state actor that can put pressure on those encrypted-messaging services can reveal which phone numbers are talking to who, and therefore who is talking to who. Sure, the message text might still be secure if the end-to-end encryption works, but the metadata is vulnerable.


> WhatsApp, Telegram and Signal use mobile phone numbers as identifiers

One notable exception (which as I understand is tptacek-approved) is Wire, which only needs an email.


I concede that Wire has essentially lifted the core crypto bones from Signal. But metadata collection is a clear distinction between Signal and Wire, so if what 'jcranmer is saying about email metadata is persuasive to you, your choice between Wire and Signal should be clear: use Signal, not Wire.


Well that settles it. Better do nothing then!


> It relies on existing mail standards

This is a feature, not a bug. Nobody actually wants to rely on a single entity (for-profit or non-profit) for their communication. Nobody wants to be stuck in crappy Electron and mobile clients.

I had some hope that Matrix might be able to alleviate those concerns and provide a modern, federated chat solution. Unfortunately their quality of implementation seems to be rather low, with slow, laggy, resource-hungry clients and ridiculously resource-hungry servers, and their current setup still apparently includes a single "identity server".


> This is a feature, not a bug. Nobody actually wants to rely on a single entity (for-profit or non-profit) for their communication. Nobody wants to be stuck in crappy Electron and mobile clients.

And nobody facing a nation state adversary wants to give their chat client their phone number. I really couldn't agree with you more on the nature of the problem and the status of the available solutions. It's really unfortunate that Matrix is as unfinished as it is. At least cross signing [1] is nearly done, which should eliminate the major issue making it unusable for anyone who's not technically competent and very dedicated.

[1] https://github.com/vector-im/riot-web/issues/9631


I think that one of the really bad problems with gpg and openssl is that for a long time there was effectively only one implementation.

So the key thing for matrix is to create a healthy ecosystem that has multiple implementations of the protocol and make sure that the protocol can actually evolve.

Note that the 'identity server' is an optional component. As long as you stick to matrix native user IDs, there is no need to use one.


Have you used Riot on mobile recently? Sure it's not perfect, but it consistently outperforms FB Messenger on my very poor phone.


No, mostly because the Matrix homeserver still needs ridiculous amounts of memory. Apparently 1GB if you want to join one of the more crowded channels – what could a chat server actually use 1GB of memory for?! That's a billion characters you can store in there; even at hundreds of messages per second (which I suppose would make the channel unusable anyway?) you don't get to a billion characters very quickly.


> Nobody actually wants to rely on a single entity (for or non-profit) for their communication

You don't have to do that. Protocols like tox for example are distributed and use DHT in order to find peers.


Most people really don't care. They use Whatsapp because that's where their friends are and they will move to whatever their friends start using next.


'People' is a poorly defined notion here, and can't really be used to draw any sane conclusions. Users have different practical and security needs, and therefore different priorities that define their behavior. PGP (and its alternatives) is to Whatsapp (and its alternatives) as apples to oranges.


Okay, the vast majority of the general population, the billions of people using apps like Whatsapp.

Note that the comment I replied to started from the rather categorical "nobody".


This screenshot of an Ars Technica journalist with Phil Zimmermann, PGP's author (yes, not the same as GPG), is very telling:

https://arstechnica.com/information-technology/2016/12/op-ed...


I'm not sure what you're implying, but without context, the screenshot is meaningless.

It could be that Phil has a policy of not storing private keys on his iPhone or something. Is that so unusual?

Anyways, maybe you're privy to context that I'm missing.


My interpretation is that if even people who are technically savvy in that specific area have times when they won't deal with encrypted information, how often are non-technical people going to want to deal with it?


According to the screenshot, it's not that he won't, it's that he can't.


I would say the implication is that PGP is useless for general secure communications purposes. You can be secure with it when you have very specific needs and have knowledgeable contacts, but it doesn't solve the need for communications privacy that most people have.


It's not a problem to be blamed on PGP though. A solution to shield an Average Joe's private life from his internet provider's snooping will always have its own deficiencies, exactly because of its generality.


>It implements RSA, which nobody should be using anymore

Can someone elaborate a bit? My impression was that RSA is fine with long keys, that elliptic curves mainly provide shorter keys, and that no decent quantum-resistant algorithm has emerged?


There's nothing cryptographically broken about RSA as a cryptosystem per se, but implementing it correctly is difficult. There have been multiple revisions to the standards for RSA over the years in response to various attacks. The current standard is PKCS#1 v2.2 (https://tools.ietf.org/html/rfc8017) and we should use RSAES-OAEP / RSASSA-PSS as primitives.
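To make the "use the modern constructions" advice concrete, here's a minimal sketch assuming Python's cryptography package (nothing to do with opmsg or GnuPG themselves; the key size and hash choices are just illustrative):

    # Sketch: RSAES-OAEP encryption and RSASSA-PSS signing via the Python
    # "cryptography" package. Parameter choices here are illustrative only.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    priv = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    pub = priv.public_key()
    msg = b"hello"

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    ct = pub.encrypt(msg, oaep)            # RSAES-OAEP
    assert priv.decrypt(ct, oaep) == msg

    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    sig = priv.sign(msg, pss, hashes.SHA256())    # RSASSA-PSS
    pub.verify(sig, msg, pss, hashes.SHA256())    # raises InvalidSignature on failure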

However, a PKCS#1 v1.5 compatibility mode with fixes for Bleichenbacher's oracle is also present in this standard, and is also specified in TLS 1.2 for compatibility reasons. In order not to provide an oracle, padding and other properties must be verified and a random premaster secret used on failure instead of returning an error message, see: https://tools.ietf.org/html/rfc5246#section-7.4.7.1 . Note that the other mitigation techniques listed there come with a lot of caveats and remarks.

Keys must also be carefully generated, as demonstrated by ROCA. In this case a specific format of primes was used to make prime number generation faster, which unfortunately also happens to be vulnerable to attack by Coppersmith's method. Any such key is weak in the sense that the private key can be recovered.

This is a quick overview. There's a lot of literature on attacks on RSA, particular parameter choices, etc. So one argument against its use is the many pitfalls that exist and the fact that, even for experts, implementing it correctly is not easy.

The other point is that, compared to Elliptic Curve cryptosystems, RSA is (usually) more expensive as an operation and key generation is certainly more expensive. As a result, elliptic curves work better in constrained environments like on smartcards. If you want to do forward secrecy by periodically generating and throwing away random keys, elliptic curves can do this more efficiently - RSA by contrast would be slow.
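To make the last point concrete, here's a minimal sketch of the "generate and throw away" pattern with X25519, assuming Python's cryptography package (authentication and transcript binding are omitted, so this is not a complete protocol):

    # Sketch: ephemeral X25519 Diffie-Hellman with throwaway keys.
    # Real protocols also need authentication; this only shows the cheap
    # per-exchange key generation that makes forward secrecy practical.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    alice = X25519PrivateKey.generate()   # fresh key pair per exchange
    bob = X25519PrivateKey.generate()

    # Each side sends its 32-byte public key and derives the same secret.
    shared = alice.exchange(bob.public_key())
    assert shared == bob.exchange(alice.public_key())

    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"example session").derive(shared)
    # Deleting the private keys afterwards is what buys forward secrecy.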


RSA is hard to implement and requires a lot of key material to change hands.

On the flipside, an Ed25519 or Ed448 key can reasonably be dictated over the phone (though you might need three minutes) and put into small low-res QR codes.

Additionally, Ed25519/448 are dead simple to implement; following the reference implementation in the RFC, you can implement a safe cryptographic scheme (encrypt/decrypt/sign/verify). You actually have to go out of your way and do things the standard doesn't include to make it unsafe.
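As an illustration of how small the API surface ends up being, a sketch with PyNaCl (libsodium's Python bindings); the message is made up:

    # Sketch: Ed25519 signing with PyNaCl (libsodium). The whole public
    # key is 32 bytes, so it fits in a tiny QR code or can be read aloud.
    import base64
    from nacl.signing import SigningKey

    sk = SigningKey.generate()            # 32-byte seed
    vk = sk.verify_key
    print(base64.b64encode(vk.encode()))  # 44 base64 characters

    signed = sk.sign(b"some release tarball digest")
    vk.verify(signed)                     # raises BadSignatureError if tampered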

Compare with RSA, where such a naive implementation will make you suffer through at least 30 CVEs of the "padding oracle" or "leak private key" type.

While RSA is fine from a mathematical standpoint, Ed25519/448 are much simpler to implement with much less code and are designed to be reasonably safe. They provide security comparable to a roughly 3000-3500-bit RSA key (and Ed448 to a far larger one), so they're on the safe side of things.

There is no post-quantum algorithm yet, at least none that has made it through the NIST competition; some candidates do involve using RSA with absurd key sizes, and they'll likely fail the competition.


> some of them do involve using RSA with absurd key sizes and they'll likely fail the competition.

Only one (specifically DJB's joke post-quantum algorithm), and it did not pass to the second round.


I don't understand the argument of dictating a key over the phone. If you care about that use case, you can just as well dictate the hash of an RSA public key.

Nobody knows if the NIST curves have backdoors. Nobody knows if Dan Bernstein's curve has issues.

The advantage of RSA is that we actually know how it works. I continue to be amazed that so many people advocate curves that we don't understand.

Of course, the big problem with RSA is that it seems simple enough that people try to implement it themselves and get it wrong. With EC, you basically have to use one of the standard libraries.


>Nobody knows if the NIST curves have backdoors. Nobody knows if Dan Bernstein's curve has issues.

There are still some differences. NIST provides some curves and doesn't explain much about them. You can read up on how DJB chose his curves. It's a very neat and tidy process that is easy to follow and very reasonable.

Elliptic-curve crypto isn't terribly more complicated than RSA; they both rely on multiplication, though EC does it over points in 2D space and RSA in modular arithmetic. I have implemented EC myself, and the underlying math really isn't that much more complicated than RSA's.

>I don't understand the argument of dictating a key over the phone. If you care about that use case, you can just as well dictate the hash of an RSA public key.

Not only dictating over the phone, but for example typing an SSH key over a serial line when you can't copy-paste directly. I've had to suffer that with my RSA-4096 key once and it's NOT pleasant at all.

It also means you have much less overhead in the protocol (DH and KEX with Ed25519 only need 32 bytes of space per step instead of kilobytes).


> There are still some differences. NIST provides some curves and doesn't explain much about them. You can read up on how DJB chose his curves. It's a very neat and tidy process that is easy to follow and very reasonable.

You're not wrong, but to paraphrase a famous quotation: nobody ever got fired for following Suite B.

AES, SHA-2, and the NIST curves are approved for government crypto, and are also probably in many industry regulations. If there's ever an incident and a post-mortem audit, then it's a lot easier to explain the choice of Suite B algorithms.


Nobody was ever fired for using DJB. Meanwhile I would gladly fire someone for using AES128 or the NSA-sponsored curves, despite being in Suite B.


> Meanwhile I would gladly fire someone for using AES128 or the NSA-sponsored curves, despite being in Suite B.

Why? Pointing SSL Labs at my bank, it's what they use (ECDH secp256r1). What does your bank use? Or is there a site that you consider more important than that one?

Would you fire the folks at Let's Encrypt, who only offer certs of RSA and P-{256,384}? Gmail, where they do offer x25519, but where most browsers use secp256r1?


Banks are not known for using the best/safest solutions. Just take 4-digit PINs and 3DES, for example.

> who only offer certs of RSA and P-{256,384}?

I am pretty sure that nginx and OpenSSL only recently added support for Ed25519 certificates. Although to be honest I don't really like the idea of Let's Encrypt; the addressing system that Tor uses has already solved that issue.

> but where most browsers use secp256r1?

This is an issue. Browser vendors should prioritize the djb algorithms.


Which aspect of elliptic curves would you like to understand better? The original Curve25519 paper contains a dedicated subsection on attack models, for example, and leaves only marginal room for hidden backdoors with its detailed reasoning about the curve parameter choice. The implementations of X25519 key agreement and EdDSA are specified in RFCs that are explicitly written to be "fool-proof", as others have already commented.


Compare the NIST curves to RSA. For RSA we know there cannot be any backdoors. If you generate good quality primes you are in business (assuming you don't make mistakes elsewhere).

For the NIST curves we cannot say anything about backdoors. We don't use those curves because we don't trust NIST, not because we have any proof that they are bad.

So to avoid that, there is a parameter selection process that supposedly leaves no room for backdoors, though at some CCC congress DJB described how you could use a similar process to add backdoors.

So basically, EC is based on magic. We cannot prove it is bad, we just have to hope there is no hidden magic.

Note you say 'only marginal room'. Soon the whole world will use exactly one curve. With 'only marginal room for hidden backdoors'.

I feel way more comfortable knowing that with RSA, what you see is what you get.


Even Bernstein doesn't really believe the NIST p-curves are backdoored, and the Koblitz/Menezes paper makes a pretty decent case that they couldn't be, but if you want to tinfoil hat it, just do what every modern system does and use Curve25519.

If any of this is new to you, though, you shouldn't be designing cryptosystems. Most people shouldn't! I sure shouldn't! It's an extremely specialized skill, and the world doesn't need that many new ones. Just use NaCl.


> but if you want to tinfoil hat it, just do what every modern system does and use Curve25519.

What is your take on the NIST curves being "officially" blessed for government data via Suite B (or whatever they're calling it)?

If it's good enough for government work, would it be good enough for us in the private sector? What are the chances that the NSA knows weaknesses in Curve25519 or ChaCha, like they knew about differential cryptanalysis attacks on DES ahead of everyone else?


Frankly I think the kremlinology is a lot less interesting and useful than the engineering facts, which are that Curve25519 is more misuse-resistant, faster, and easier to implement in constant time. People shouldn't be using the P-curves anymore.


99.999% of vulnerabilities come from bad implementations, not "magic"!


https://blog.trailofbits.com/2019/07/08/fuck-rsa/ covers that.

The executive summary is that while it's possible to implement and use RSA properly:

* it looks easy to implement, so it's common for people to roll their own (and then it's insecure or broken); ECC looks hard (though it isn't necessarily), so devs are more likely to use properly vetted libraries

* because most of the parameters must be kept secret, good advice is hard to find (and good parameter selection is absolutely critical), not to mention that e.g. choosing good exponents has complexity and performance implications[0]; ECC parameters are public and you can pick existing good ones

[0] and security grows sub-linearly with key size: 2048-bit RSA has about 112 bits of security, 4096-bit RSA about 140 bits


That article was discussed on HN about a week ago and there are some interesting comments on that thread: https://news.ycombinator.com/item?id=20381779


I'm just interested, but it reminds me of https://facthacks.cr.yp.to/ which is a horizontal attack: finding the weakest keys in a big set of them, plus batch factorization attacks. That line of attack seems more developed against RSA keys?


It implements EC with _fallback_ to RSA if EC is not available. Blame OpenSSL, I guess, because this relies on it. Even its CLI seems to be inspired by the atrocious OpenSSL CLI.


Fallback in crypto is wrong. Either succeed or fail. I don't need to sit there guessing how secure my secure messenger is.


> accidentally replying without encrypting

There have been automated solutions for this for many, many years.


Forgive me if I can't read but I couldn't quickly find the following. Is it possible to use Opmsg as a library?

If you want to use GPG as a library in your application right now, you can't. The best thing you can do is parse GPG output, which is ... less-than-ideal at best and downright wrong and dangerous at worst. If one could use this as a library rather than GPG, that would be a huge win.


I've been reading about crypto lately and https://download.libsodium.org/doc/ seems to be the go-to library for what you're pointing out.
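For a sense of what that looks like as a library call, a minimal sketch via PyNaCl (libsodium's Python bindings); the message bytes are made up:

    # Sketch: libsodium "sealed box" - encrypt to a recipient's public key
    # without the sender needing a key of their own.
    from nacl.public import PrivateKey, SealedBox

    recipient_priv = PrivateKey.generate()
    recipient_pub = recipient_priv.public_key

    ct = SealedBox(recipient_pub).encrypt(b"attachment bytes")
    pt = SealedBox(recipient_priv).decrypt(ct)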


The perfect forward secrecy here seems to involve deleting a "persona". I am not sure how that is any different from doing PFS by deleting a PGP subkey. As with PGP, if you do this you lose email archived under that persona. There is no automation to re-encrypt the archive with a different key.

So I am not really sure if this has a killer feature that would make anyone want to go to the bother of abandoning PGP...


I think a perhaps unclear part of the recent post "The PGP Problem" is that PGP is bad for email.

If you don't use it for email, I don't see it as really a problem. Unless, maybe, you are a reporter or otherwise not clear on the principles behind using something like GPG. Personally, I think the point of all this discussion is that for laypeople PGP and email are just too complicated (it's complicated even for me as a programmer, and evidently for others too).

In that same vein, I can see how PGP has fundamental limitations with email, e.g.: having someone's email address does not imply that you have their public key. Is it possible to state in simple terms what OP's program does to improve this?


> I think a perhaps unclear part of the recent post "The PGP Problem" is that PGP is bad for email. If you don't use it for email, I don't see it as really a problem.

You're apparently asserting that Latacora's "The PGP Problem" states PGP is only bad for email. I can only assume you didn't even bother to read the article? Because it states that PGP is bad:

* period and in its entirety: in fact most of the article (section 1, "The Problems") covers the various ways PGP and GnuPG are broken at the core; specific scenarios are only mentioned (in section 2, "The Answers") to provide alternatives, because Latacora's assertion is that in cryptography one size does not fit all, and each scenario needs its own toolset

* for securing messaging

* for securing email messages

* for signing files and packages

* for encrypting files, whether to send, backups, application data, …

What it does state with respect to email is that encrypting emails is a fool's errand, not just that using PGP to do so is a mistake. It does note that GnuPG is also specifically bad at it, but it very clearly states the issue is not limited to PGP:

> This isn’t going to get fixed. To make actually-secure email, you’d have to tunnel another protocol over email (you’d still be conceding traffic analysis attacks). At that point, why bother pretending?

> Encrypting email is asking for a calamity. Recommending email encryption to at-risk users is malpractice.


So what's the alternative, for asymmetrically encrypting arbitrary binary data?

For sending such data to others, the Latacora article suggests a tool called "Magic Wormhole" that I've never heard of or heard recommended by other experts. It's a new tool that (from what I can tell) has a whole crapload of limitations and assumptions that PGP does not have: https://magic-wormhole.readthedocs.io/en/latest/welcome.html...

One of the most severe is that it apparently requires both ends to have active internet connections to transfer the data over the wire between them. As I type this, I'm visiting my parents, who have 5 Mbps internet. Let's hope that file isn't big, or that my contact has the time and patience to wait if it is! The other is that it apparently relies on a shared password, which just takes us back to the pre-public-key world - the very problem PGP was designed to solve...

For simply "encrypting files", even Latacora gives up and says "use PGP"!!

As a side note, though I agree with them that PGP is not good for secure messaging, I don't find their alternatives convincing there either. Signal and Wire don't have solid group chat capabilities that don't rely on a single central server run by a third party and don't require private information like a phone number to use. I consider that absolutely basic for a good messenger. At least PGP, though very faulty in this area, is designed to be used over existing protocols like email to make them secure, so it doesn't have the last two limitations.


Wormhole isn't new, and if you haven't heard another "expert" recommend it, you don't hang out with a lot of cryptography engineers.

The point about "simply encrypting files" is that nobody is implementing something with PGP's "encrypt-a-file" interface because it's not that useful; rather, people purposefully design modern systems with cryptography tailored to tasks, like messaging or file transfer or backup.

Your last point about PGP vs. Signal is pretty funny, as it implies that PGP has "solid group chat capabilities".


> Wormhole isn't new, and if you haven't heard another "expert" recommend it, you don't hang out with a lot of cryptography engineers.

You're right, I don't. But the earliest thing I found about Wormhole after a quick look was from 2015, which I think is pretty recent in the crypto world. Maybe I missed something.

> Your last point about PGP vs. Signal is pretty funny, as it implies that PGP has "solid group chat capabilities".

I tried to be clear about the fact that I wasn't saying that. Its benefits over Signal and Wire (but not Matrix) are that it doesn't require a central server and doesn't require any PII to sign up. I consider those crucial for anyone who has extreme security / privacy needs. PGP completely sucks for group messaging, I agree. But the alternatives suggested are simply non-starters for many use cases.


Tell me more about what your book says about the crypto world? Is Noise ok now?


Maybe? It depends on your requirements. Don't most experts recommend extreme caution with cryptography approaches and software that's less than a decade old? Has that changed? Do we move fast and break things now too?

Would also like to hear your thoughts on why / whether Signal and Wire are actually good recommendations.


I'll tell you what I don't understand. I don't expect random engineers on HN to be especially crypto-literate, nor should they be: it's a super-specialized field that demands a lot of spare storage capacity in your brain, and a lot of us had enough algebra after Algebra II in 10th grade. Engineers who specialize have a whole huge variety of things to pick: machine learning, distributed systems, optimization, network algorithmics, graphics, systems security, you name it. There's no reason a significant number of people here should have to know what Noise or SPAKE2 is.

What's weird is: if you don't know what any of this stuff is, why would you feel the need to express strong opinions about it? Is it really your belief that intuition and a drive-by reading of some slides on a GitHub page can bring you up to speed with the field? I read every "Call Me Maybe" post and I absolutely do not think I'd have a chance in hell at getting a distributed commit protocol right. Hell: I "specialize" in cryptography and feel the same way about crypto protocols!

My thoughts about Signal and Wire are that I did a good job of relating in the post you're talking about what I think about Signal and Wire.


I'm not sure what this is in response to, actually. I'm not expressing any particular opinion about cryptography. I have no opinions and I defer to experts. One thing that I have heard experts say is that we should be very hesitant to use new protocols and software in areas in which a maximum of security is needed.

So I have no opinion of Wormhole, other than to say that (1) it is new, and experts I have listened to in the past say to be wary of new approaches, and (2) it (according to its own documentation) has some rather extreme limitations that make it arguably not a good fit as a general purpose solution to encrypted file transfer.

> My thoughts about Signal and Wire are that I did a good job of relating in the post you're talking about what I think about Signal and Wire.

As I don't think I have referred to any post by you, I don't know what you think about those solutions. If you are referring to the Latacora article (I think you wrote it, but I'm not sure because it doesn't specify its authors), it says (IMO) very little about the merits of Signal and Wire compared to other systems, and nothing at all about the specific criticisms I made in my comment you originally replied to.


What experts are you listening to? Are they telling you to use PGP in 2019?


Why do you keep nitpicking tiny aspects of what I'm trying to say? GPG sucks. Everyone agrees it sucks. The consensus of experts is that you should avoid it where possible. I have not denied any of that.

The problem is that some experts love to suggest alternatives that have severe limitations that GPG (for all its very real faults) does not have.

For 1 to 1 encrypted chats, I would trust Signal provided that OWS having my phone number (and that of my conversation partner) was not a security risk, and that I'm not facing a nation state level adversary who could take over OWS servers and push a compromised version of the app to me via update mechanisms.

In my opinion those are serious limitations that GPG does not have. And this is an area in which we're talking about the most developed alternatives (other than signing). Other areas like encrypted file transfer and group chat are even worse.


With your requirements from another post in mind:

Regarding file transfer, 'age' was mentioned in a previous thread. Problem is that it hasn't been implemented yet... https://docs.google.com/document/d/11yHom20CrsuX8KQJXBBw04s8...

Regarding a secure group chat messenger with your two requirements mentioned, the consensus (among experts that have given their opinion here) seems to be that there is currently no such solution (I guess some versions of Wire could fit given that they seem to have on-premise deployment: https://wire.com/en/pricing/#pro/).


Thanks for the response. That matches my understanding of the situation. I'm not familiar with age, but I hope it turns into a usable tool for some of these use cases, particularly asymmetric encryption.

Wire might eventually get there, but as far as I know they still haven't implemented federation, so (though I might be wrong) even their paid deployments would be limited to some particular network on which both conversants had accounts.


Why do you want asymmetric file encryption?

* If it's to send messages to people, you want a forward-secure ratcheting secure messenger (regardless of your message lengths or the duration of conversations).

* If it's to back things up, you want a secure backup system like Tarsnap, and even if you don't, your system's sector-level symmetric encryption also does this. Asymmetric encryption is relatively high-risk! You don't want it unless you absolutely need it!

* If it's to send files to people, you want a secure file transfer system; you're not looking to transform the file itself, but rather to establish an end-to-end secure transport for the file.

* If it's to do secure package distribution, you want a simple file signing system, and OpenBSD already nailed this: it's called signify, and the portable compatible variant of it is minisign.

* If it's a component of an application you're designing, what you want is a library that encrypts blobs, not a program that encrypts files. Use libsodium.
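(For that last bullet, a minimal sketch of "a library that encrypts blobs", shown with PyNaCl as a stand-in for libsodium; how you store the key is your application's problem and isn't shown.)

    # Sketch: authenticated symmetric blob encryption with libsodium's
    # secretbox (XSalsa20-Poly1305), via PyNaCl.
    import nacl.secret
    import nacl.utils

    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)  # 32 random bytes
    box = nacl.secret.SecretBox(key)

    blob = box.encrypt(b'{"user": 42, "note": "application data"}')
    original = box.decrypt(blob)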

I'm not denying that there are cases that don't fit any of these buckets, but I'd be interested in hearing what they specifically are.

I'm a lot less interested in hearing things like "sure I want a ratcheting secure messenger but I also want to run my own server" because that's not my point. I want a use case that fundamentally demands a standalone file encryption program, not an argumentum-ad-Rube-Goldberg for why you're duct-taping something together with a file encryption program to accomplish backup or package distribution or whatever.


Thank you for taking the time to give a thoughtful response, in this and your other comment. Here is what I take to be the use case - it concerns whistleblowers, who have among the most serious real world security needs.

I'm thinking of sending files to people. Large dumps of documents and other data from several GB to several TB. Imagine that any of the following apply:

* The data is in a secure location and can only be exfiltrated by the internet, not via a thumb drive. (And so my connection will be closely monitored.)

* The data needs to be sent to someone with whom I don't have another means of secure communication. In other words, I can't use symmetric encryption with a pre-shared key.

* I or my correspondent have slow or unreliable internet connections and maintaining a direct connection between each other would be difficult or impossible for the length of time needed to transfer the data.

* I or my correspondent are being closely monitored by the authorities, such that connecting directly to each other or using intermediaries that would allow us to do so is an unacceptable risk.

If any of these situations apply, I can't use Wormhole or similar approaches. I need something that will allow me to encrypt files for my correspondent without a shared secret, where I can upload those encrypted files to a service like Google Drive for them to download on their own time. This is precisely what PGP does.

I don't think it's unrealistic to imagine situations in which a source or journalist has one or more of these issues. I would very much like to see an alternative for use cases like this one. I'm in agreement with you that PGP is bad in many ways, from usability to the cryptographic implementation. But it's hard to deny that it's a remarkably robust solution to problems like this one.


I guess what I don't see here is the case where (1) you can safely upload a .PGP file to Google Drive (or anywhere across the organization's network perimeter) without immediately triggering an investigation (PGP files being particularly easy to spot) and (2) you can't use Signal and Signal's built-in file transfer.


The opposite of what you're saying is true. File transfer is especially straightforward. Group chat on secure messengers works far better than it ever did with PGP. And, incidentally, there is no 10-year rule on cryptography.


Good! So what are the solutions? What file transfer approach

(1) Doesn't require an active connection between sender and receiver over the internet for the duration of the transfer

(2) Doesn't require a shared secret

What group chat secure messenger

(1) Doesn't rely on a central server operated by someone else

(2) Doesn't use PII as the basis of a user's identity

These are what I regard as the most basic of requirements for each need. PGP provides them (in broken, hard to use ways); others have struggled. If you actually do have solutions here, rather than just complaints about PGP, I do want to hear them. But you've taken this thread down a rabbit hole rather than respond to my original point, which was about the limitations of the suggested alternatives.


I'm sorry for being snippy this morning. Read the /r/linux response to the PGP post to get an idea of what my baseline assumptions are about technologist disengaged with cryptography who have strong opinions about which cryptosystems people should use. But I shouldn't be projecting that on random people on HN, and I apologize for doing that to you.

My take is this (unless otherwise stated, "you" is the rhetorical "you", not you specifically):

1. PGP is bad and should be disfavored. Moreover, for some of its use cases, most especially "secure email", PGP flat-out doesn't work, and people need to be warned away from it. Most users will never see this HN thread and are getting their advice third- or fourth-hand. It's important that the "source" node for game-of-telephone graph have clear, sound advice. "Stop using PGP ASAP" is the most responsible advice I can offer.

2. The mainstream modern secure messengers are better than anything else technologists have provided to end users. If you're protecting serious things --- and the PGP team certainly claims to be doing that, so when you dip into conversations about PGP and its alternatives, you need to keep in mind that you're dipping into a game of telephone that ends with dissidents in countries like Venezuela where there are death squads --- Signal or something like it is the best you're going to do. You can be annoyed by this, and you're entitled to your opinion, but not to your own facts.

3. Nerds (I am one of those, btw) generally want something that (1) doesn't bootstrap off phone numbers and (2) lets people run their own servers. My interest in strong privacy engineering means, frankly, I don't give a shit about federation. I'm convinced that Moxie is right: federation means lowest-common-denominator protections; see Matrix, which has for years now been operating a network of off-by-default opt-in-only poorly-supported end-to-end encryption while at the same time being a darling of message board nerds. You don't have to disavow Matrix to be serious about security and privacy; I wish the project the best, too. But you can't be recommending it to immigration lawyers and foreign activists and demand to be taken seriously about security/privacy engineering (you didn't do that, but the modal "lol signal" nerd does), and I will strike down with great vengeance and furious anger anyone who tries to interpose themselves as an authoritative "source" of cryptographic advice and broadcasts to the world that Matrix is a reasonable option for those people.

4. I'm not a cryptographer, but one of my partners is, and I am myself a cryptographic vulnerability researcher (I'm comfortable assessing and breaking systems but wouldn't ask anyone to sign off on a system I designed). I feel pretty reasonably engaged with the cryptographic engineering community, which is extremely distinct from "people with strong opinions and affiliations with one or more well-known open source crypto-adjacent projects". It is "challenging" for me to hear things like (paraphrasing) "the experts I pay attention to say that you should wait a decade before using new cryptography". There is no such rule and, really, no way to escape the fact that if you're going to evaluate cryptosystems, you either need to develop the domain knowledge to do so based on intrinsic arguments, or enough domain knowledge to know whose recommendations are credible.

So my take is: I think you would have trouble coming up with a cryptography domain expert who will tell you our recommendations are "bad" (and, in fairness, one of the reasons that's the case is that our recommendations are vetted). Wormhole is fine --- pretty great, in fact. Be happy there's a system you can realistically use; it's good news. Ordinary people, like immigration lawyers, can't yet realistically use it, because it's a command line tool. I think that's going to change this year, but in the meantime, Signal does file transfers more safely than PGP does, so they should use that. Meanwhile, nerds like us can send files to people without keyservers and without giving up phone numbers. Or, if you don't have, like, access to the Internet, you can do something else.


By the way: why Signal/Wire/WhatsApp? (That being your earlier question):

* End-to-end encrypted by default, to the point that it's difficult to send plaintext.

* Radically simplified UX that doesn't meaningfully involve key management, which lawyers and activists cannot handle.

* By-default support for message log grooming / "disappearing messages".

* Modern cryptographic primitives.

* Extensively vetted cryptographic design.

* Works on phones, which are (1) the platform of choice for ordinary users, and (2) almost always more secure than desktop computers.

If you narrow the list down to Signal, you get to further add:

* Serverside privacy optimization / metadata minimization, so there's no targetable repository of all-pairs communicating parties, which is extremely valuable information for state-level adversaries.

* A commitment to not releasing features that don't square with those privacy objectives, so that for instance you couldn't share GIFs in the app until Moxie and his team figured out how to tunnel Giphy requests to foil traffic analysis, and you didn't even get user profiles until a year or so ago, when Moxie and his team figured out how to provide them without generating a serverside database of identities.

* A multi-year track record of deep security auditing and a high-profile recipient of volunteer auditing unmatched by any other messenger.


I agree with your list of benefits. Only thing I really want to comment on (but see my response elsewhere) is

> My interest in strong privacy engineering means, frankly, I don't give a shit about federation.

I think the point of federation is that I don't know any way to get all three of the following:

1. I can run my own server, and keep my own account on it. (Last I checked, Signal couldn't realistically do that, but maybe things have changed or I was mistaken.)

2. I can communicate with one of my sources without them having to sign up for an account on my server (which may not be possible in certain conditions).

3. The protocol is not federated.

But maybe we mean slightly different things by federation.

Anyway, I'm sorry you've been targeted by people overly enthusiastic about the merits of PGP. I'm certainly not one - it's absolutely a very flawed approach.


I can see the case for a central server w.r.t. matching conversation pairs.

My objection is this: Signal (and others) seem to only support smartphone clients well. There's a genuine use-case for a more advanced user who wants to control every aspect of the platform he is using, including exactly when and what packets are being sent to whom. More importantly, the user may not want to process data on a von Neumann machine designed & administered by Apple/Google/Samsung. It would be nice if there were a supported command line interface, where the plaintext & encryption aspect of the protocol were handled separately from the ciphertext network traffic.


Is there a similar consensus about Syncthing and whether it is secure enough (or not) for file-sharing among cryptography engineers?


Look, my point is this: If I open my text editor and type some text, save the file and sign it with GPG with my private key to prove authorship, then I don't see how you are going to break this. In that specific sense GPG is not broken.

There are countless online puzzles or cryptography use cases based on this. I am asserting that since GPG is not broken in this specific sense, you can't just say GPG is broken "in its entirety".


My understanding is quite different. Email is inherently insecure and there is nothing you can do about it.

PGP is insecure for everything else as well

The Latacora article was eye-opening for me on the email problem - quite simply, if I send an encrypted mail to a friend / colleague - which I intend them to read - and they read it and quote it to someone else in plain text, then that's it: my plaintext and my cipher are available in the wild and my key (my long-term key) is effectively broken.

I simply never thought it through that way. But that's how email is supposed to be used - it will be used that way.

Mind Blown.


Wait, what? No. Leaking plaintext doesn't reveal your long term key. I definitely didn't write that.


I finally got that when someone pointed out that the long-term key is not used to encrypt the mail content - I made that leap incorrectly and went from there.

I made the change downthread - it was too late to edit the original - and I hope I clearly pointed out that your article did not say that.

It takes me several run ups to understand most security issues and I got all excited before having my coffee that day.


Messages are encrypted with a session key in OpenPGP, so this doesn't work.


How do I negotiate a session key with the email recipient? It's a single transmission. Otherwise it's a one-time pad perhaps - but then that defeats the point of the public key?

Would you mind expanding on this, as it is an interesting area.


A random key is used to encrypt the email, then that random key is asymmetrically encrypted using the recipient's public RSA key. You do not use your own key to encrypt mails to someone else. Indeed, you can send encrypted emails without even having a key of your own.
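Roughly, the shape of that hybrid scheme looks like the sketch below (this is not OpenPGP's actual packet format or cipher choices, just an illustration using Python's cryptography package). It also shows the forward-secrecy complaint from upthread: whoever later obtains the recipient's long-term private key can unwrap every stored session key.

    # Sketch of hybrid encryption: a random session key encrypts the body,
    # and only that session key is wrapped with the recipient's RSA key.
    # Not OpenPGP's real wire format; illustrative only.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    recipient_priv = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    recipient_pub = recipient_priv.public_key()
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Sender: fresh session key per message, no sender key required.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    body_ct = AESGCM(session_key).encrypt(nonce, b"the email body", None)
    wrapped = recipient_pub.encrypt(session_key, oaep)

    # Recipient: unwrap the session key, then decrypt the body.
    body = AESGCM(recipient_priv.decrypt(wrapped, oaep)).decrypt(nonce, body_ct, None)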


So I actually re-read and followed the links (I mean, what's wrong with lazy assumptions anymore).

tptacek's "why email is insecure" post is here: https://news.ycombinator.com/item?id=16088386

And yes, that's what he says in the original Latacora post.

I read in the article: "invariably CC the quoted plaintext of your encrypted message to someone else (we don’t know a PGP email user who hasn’t seen this happen)"

So I made the (incorrect) leap to pgp using the long term key to encrypt files. My bad.

But this does not fix the original point it seems - email is not going to be "secure" any time soon. But you can send encrypted files over email to people.


> So I made the (incorrect) leap to pgp using the long term key to encrypt files. My bad.

It does, indirectly: the long-term key is used to encrypt each message's session key. Which is why PGP has no forward secrecy, and if I steal your key I can decrypt all your past and future mails.


> my plaintext and my cipher are available in the wild and my key (my long term key) is effectively broken.

Wait a minute. I may have to read the Latacora article again but if we are dealing with ciphers where having a plaintext attack reveals the key, I think we're in a lot bigger trouble than I ever imagined. To be blunt I don't really believe it and it would take some explanation to convince me it's true.

Edit: OK, I think I see the problem. I believe the quote in the article is discussing the fact that the user happily quotes a message and doesn't re-encrypt it, meaning that you have accidentally leaked the plaintext -- not that the key is known. So I think they are arguing that we should write apps so that it is impossible to copy the plaintext.


Since (as detailed in the reply chain) this is completely incorrect (PGP simply doesn't have this vulnerability), can you edit the comment to that effect? Otherwise this claim is a bit dangerous.


Exactly. The "PGP is bad because public key infrastructure management is hard" meme should please die already.

A dedicated package signing and encrypting tool detached from this problem is maybe not a bad idea in that regard, because it removes this stigma.


The reality of the situation is that you can't remove the human factor from security. So someone copying your email to someone else is a human problem that can't be fixed - someone could just as easily photograph the screen.

The reality is email will continue to be used, and there is a use case for being able to send an email securely to another person.

EFail was pretty bad, but only affected HTML email.

Having a modicum of backwards compatibility is how to encourage transitioning to new tech, so the RSA implementation makes sense.

I must say this is the first decent alternative I've seen for GPG instead of rants about Signal and specialised tooling that just ignore the issue that folks want to be able to send secure emails to each other.


I would love to see a sound implementation of secure e-mail, but I also believe that such a system should not be built on top of SMTP.

Signal’s cryptography seems stellar, but to me it feels a bit weird to use instant messaging as a full replacement for electronic letters. I’m guessing here, but it would probably not be impossible to build a more traditional e-mail client on top of the Signal Protocol.


I don't see why this should not be possible.

Isn't this mainly about adding a subject metadata field and a client that just displays messages differently, enables sorting into directories, and so on?

Is there a real technical difference between messaging and long form emails that I don't see?


Email is based on open, federated protocols. Every successful instant messaging service (sorry, XMPP) has been a single closed provider.


Replacing email with a separate "secure email" system would probably work, at least until idiots write awful implementations of the secure-email protocols.


The "MITM on all HTTPS traffic in Kazakhstan" issue suggests that relying on STARTTLS for email encryption isn't that great.


To be fair, e-mail is garbage through and through. You can't even use SNI, nothing cares about certificate validity, even less about OCSP stapling and CT.


How about MTA-STS? I guess that improves the situation a bit, no?


Not really. Until MTA-STS is deployed in “hard fail” mode by almost everybody it doesn’t matter.
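For what it's worth, "hard fail" here is just MTA-STS's enforce mode; a deployment is one DNS record plus a small policy file, sketched below with placeholder names (example.com and its MX are made up):

    ; DNS TXT record advertising that a policy exists
    _mta-sts.example.com.  IN TXT "v=STSv1; id=20190719T000000"

    # Served at https://mta-sts.example.com/.well-known/mta-sts.txt
    version: STSv1
    mode: enforce
    mx: mail.example.com
    max_age: 604800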

Similarly, SPF/DKIM did not solve spam because nobody was willing to really drop incoming mail with bad or missing signatures.

Email is an “ossified” protocol. It should fade away, and be replaced with something else modern and secure like a “federated Signal”.

If that something else allows anyone to send to anyone without permission, it too will be killed by spam.


Slightly off-topic, but having the donate button placed first on the page, even before I can read what I'd be donating to, seems a bit greedy.



