NIST Post-Quantum Cryptography Round 1 Submissions (nist.gov)
130 points by sohkamyung on Dec 27, 2017 | 48 comments


If you can tolerate ~1 MB public keys, Classic McEliece is probably the most conservative and secure post-quantum option right now, while the newer proposals receive more analysis. It's old and proven.

>9 Advantages and limitations (2.B.6)

>The central advantage of this submission is security. See the design rationale.

>Regarding efficiency, the use of random-looking linear codes with no visible structure forces public-key sizes to be on the scale of a megabyte for quantitatively high security: the public key is a full (generator/parity-check) matrix. Key-generation software is also not very fast. Applications must continue using each public key for long enough to handle the costs of generating and distributing the key.

>There are, however, some compensating efficiency advantages. Encapsulation and decapsulation are reasonably fast in software, and impressively fast in hardware, due to the simple nature of the objects (binary vectors) and operations (such as binary matrix-vector multiplications). Key generation is also quite fast in hardware. The hardware speeds of key generation and decoding are already demonstrated by our FPGA implementation. Encapsulation takes only a single pass over a public key, allowing large public keys to be streamed through small coprocessors and small devices.

>Furthermore, the ciphertexts are unusually small for post-quantum cryptography: under 256 bytes for our suggested high-security parameter sets. This allows ciphertexts to fit comfortably inside single network packets. The small ciphertext size can be much more important for total traffic than the large key size, depending on the ratio between how often keys are sent and how often ciphertexts are sent. System parameters can be adjusted for even smaller ciphertexts.
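The "single pass over a public key" point is easy to see in code. Below is a minimal toy sketch (Python/numpy, toy sizes, with a random matrix standing in for the real structured public key -- this is not actual Classic McEliece, which also needs a decodable Goppa code and a CCA transform) showing that encapsulation is essentially one GF(2) matrix-vector product:

    # Toy sketch: McEliece-style encryption is one matrix-vector product
    # over GF(2) plus a low-weight error vector. The matrix here is random,
    # purely to show the shape of the operation -- NOT a real Goppa code.
    import numpy as np

    k, n, t = 4, 8, 1                     # toy sizes; real: k, n in the thousands
    rng = np.random.default_rng(0)

    G = rng.integers(0, 2, size=(k, n), dtype=np.uint8)   # stand-in public key
    m = rng.integers(0, 2, size=k, dtype=np.uint8)        # message bits
    e = np.zeros(n, dtype=np.uint8)
    e[rng.choice(n, size=t, replace=False)] = 1           # weight-t error

    c = (m @ G + e) % 2      # ciphertext: a single pass over the public key
    print(c)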


Can't really see standardization settling on something that requires ~1,000-10,000x the overhead of its competitors. 1 MB public keys would be a problem for things like IoT devices and blockchain-type usages, and locally caching keys could take gigabytes.

Hybrid secret-sharing schemes do a lot to mitigate both future quantum risk and the present risk of a broken cipher system.


One of them has already fallen victim to a key recovery attack:

https://twitter.com/yx7__/status/945283780851400704

Round2 (LWE-based) and SIKE (isogeny-based) are the particularly interesting ones to me. Both support comparatively small keys (~1 kB), with Round2 seemingly winning on performance, though it is also patented.


I never understood how or why patenting math is a thing.


It is patenting the application of math.


How do you differentiate between math and the "application of math"?


You can't patent an S-box transform. You can patent using a specific S-box transform to encrypt / decrypt data. That's the justification at least.


More like you can patent math when it is applied via a computer. So you would not be violating the patent by evaluating the S-box with pen and paper.

It makes no particular sense, but the theory behind math patents being out of bounds is that math is a fact. Software patents, including crypto patents, do not "feel" like facts to most judges, even if they technically are (see: Curry-Howard Correspondence).


I like to ask dumb questions... So here is one: Are forward secrecy guarantees made by TLS valid in a post-quantum world?

Specifically, say I connect via TLS to a server that uses FS and exchange some information that will be valuable in 150 years. If the entire conversation is recorded at the wire, does quantum computing break this promise? (or have I misunderstood FS completely?)


In general, forward secrecy involves generating a new key for each conversation with Diffie-Hellman or something similar.

According to https://crypto.stackexchange.com/questions/5610/diffie-hellm..., Diffie-Hellman is not post-quantum secure, but https://en.wikipedia.org/wiki/Supersingular_isogeny_key_exch... is a DH alternative that is.
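To make the mechanism concrete, here's a minimal sketch of an ephemeral exchange using X25519 from the pyca/cryptography package (note that X25519 itself is not post-quantum; a quantum attacker who recorded the handshake could recover the secret):

    # Minimal sketch of forward secrecy via ephemeral Diffie-Hellman:
    # fresh keypairs per session, discarded afterwards, so a stolen
    # long-term key can't decrypt recorded traffic. Uses pyca/cryptography.
    # X25519 itself is NOT post-quantum.
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    alice_priv = X25519PrivateKey.generate()   # ephemeral, this session only
    bob_priv = X25519PrivateKey.generate()

    # Only the public keys cross the wire
    alice_secret = alice_priv.exchange(bob_priv.public_key())
    bob_secret = bob_priv.exchange(alice_priv.public_key())

    assert alice_secret == bob_secret          # shared 32-byte session secret
    # ...derive session keys from the secret, then delete the private keys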


Which is partly why we wrote and open sourced a SIDH implementation in TLS 1.3: https://blog.cloudflare.com/sidh-go/


I <3 cloudflare, you guys are awesome!


Do you have any performance numbers for SIDH vs ECDHE?


Cloudflare has a Go implementation of SIDH with p751: https://blog.cloudflare.com/sidh-go/

Here’s an overview of the performance from a patch by Armando Faz Hernandez: https://github.com/cloudflare/p751sidh/pull/2


https://www.fredericjacobs.com/blog/2016/04/07/qc-axolotl/ might be a good place to start answering this question.

If you can break the ECDH, you can figure out the key that was calculated. However, if other unknown data (e.g. past ECDH outputs) is hashed together to generate the key (like Signal does, as a very rough description), it remains post-quantum secure only as long as the adversary misses at least one message.

Other than weird saved-by-the-bell corner cases, the answer is "No".
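A rough toy sketch of the chaining idea described above (loosely modeled on Signal's ratchet; not the real protocol):

    # Toy sketch: mix each fresh DH output into a running chain key, so an
    # eavesdropper must capture and break EVERY exchange to keep tracking
    # the key. Hypothetical simplification, not the actual Signal protocol.
    import hmac, hashlib

    def ratchet(chain_key: bytes, fresh_dh_output: bytes) -> bytes:
        """Derive the next chain key from the old one plus new DH entropy."""
        return hmac.new(chain_key, fresh_dh_output, hashlib.sha256).digest()

    chain_key = b"\x00" * 32                     # initial shared secret
    for dh_out in (b"dh1", b"dh2", b"dh3"):      # successive (EC)DH outputs
        chain_key = ratchet(chain_key, dh_out)
    # Missing even one dh_out leaves the attacker unable to compute chain_key.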


I wrote a blog post about this. tl;dr: no, not unless you were using PQ-hard algorithms.

https://tonyarcieri.com/imperfect-forward-secrecy-the-coming...


>Are forward secrecy guarantees made by TLS valid in a post-quantum world?

No, forward secrecy doesn't apply if the underlying public-key cryptography gets broken (such as by a quantum computer).


I'm seriously considering moving to pre-shared keys. Vernor Vinge's books (_A Deepness In The Sky_, others) talk about how much of interstellar shipping is the secure transportation of random pre-shared keys.

I'm unfamiliar with anything more sophisticated than a password manager like 1Password and S/Key, but I'd like to see if I could have a go at this for my personal kit.

Can anyone recommend keywords/papers that talk about pre-shared key authentication at automated scale?


PSK isn't really a good solution, mostly because of scaling problems.

Today, all you need is a 32-byte ECDH private key, and a corresponding 32-byte public key. To talk to any of the billions of devices on the Internet securely, all you need is their 32-byte public key, and some means to verify that it is authentic (i.e. certificate authorities + certificate transparency). With this information, you can calculate a shared secret (or use a sealing API given only their public key, if you really want).

Total local storage: PKI overhead plus your own keys, under a few megabytes.

Doing PSK at scale means storing billions of 32-byte binary strings. One for each participant you intend to communicate with. They must also do this for every possible device they wish to communicate with, otherwise others can just eavesdrop on their conversations if keys are reused.

You've turned an O(n) space problem into an O(n^2) space problem.

If your n is small, you might still be tempted to make the move, but it will quickly become unmanageable as n grows.
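A hypothetical back-of-the-envelope, assuming 32-byte keys and an n in the billions:

    # Back-of-the-envelope for O(n) vs O(n^2) storage, 32-byte keys throughout.
    n = 2_000_000_000                   # ~billions of devices (assumed figure)

    pki_bytes = n * 32                  # one public key per device, system-wide
    psk_pairs = n * (n - 1) // 2        # one shared key per *pair* of devices
    psk_bytes = psk_pairs * 32

    print(f"PKI: {pki_bytes / 1e9:.0f} GB system-wide")    # ~64 GB
    print(f"PSK: {psk_bytes / 1e18:.0f} EB system-wide")   # ~64 exabytes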

Furthermore, if a device loses its key, you must:

1. Generate/share a new keypair with n devices (which at Internet scale is billions), and

2. Do so in a way that doesn't open the door for backdoors, which probably requires digital signatures and therefore you're not actually getting rid of public-key cryptography

I think at Internet scale, we're still years away from having reliable post-quantum cryptography to even recommend, let alone adopt, but that's going to be the way to go. Something something baby and bathwater.


Passwords are PSKs and are the overwhelming default authentication mechanism for Internet scale systems.

Storing, distributing, and synchronizing billions of small blobs is actually a solved problem and not that big a deal. I wouldn't be so quick to write it off, because it has a real, working, easily understood invalidation and revocation story, which PKI does not.

A system that combines asymmetric crypto public-private key pairs with the distributed-whitelist properties of simple passwords, by treating the public keys as PSKs, has a lot of compelling benefits.


> Passwords are PSKs and are the overwhelming default authentication mechanism for Internet scale systems.

Not for HTTPS, which is what we're talking about.

For WPA2-PSK, okay, you have a small n, it might work.

For user authentication against a local database, you're not encrypting; you're validating against a password hash and then elevating the user's privileges for the duration of their session. That's far removed from what we mean when we say PSK (pre-shared key), where encryption is implied.


To be fair, that’s what you’re talking about. I am mostly talking about SSH access.


Passwords are largely stored on servers with enough storage to handle their users.

However, distributing a PSK to every device is not friendly. Ten million passwords will take at minimum about 80 MB; some phones and IoT devices have only ten times that much permanent storage, or less.

Passwords work mostly because the other side has the storage and CPU capacity to operate with them.


Are you aware of what forms of asymmetric crypto are broken by quantum computing?


That's not N^2, it's N*k, where k is the number of devices per person and is generally very low. Most people will have at most 3 or 4 devices. You could also have a single key and store it on a wearable device, like a phone or watch, and we're back down to a single key per person.

I think the bigger issue with PSK at scale is key generation and distribution.


It's n^2 across all devices. You don't use the same key between two different pairs of parties.


In practice you would probably use something like Kerberos to avoid the quadratic overhead.


As long as the AES key size is 256 bits and not one of the 128-bit versions.


Ohh, I see what you're saying.


Actually, he ships around giant blocks of random numbers in a k-of-n format to be reassembled as a one-time pad.
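For illustration, here's the trivial n-of-n XOR split of pad material (a real k-of-n threshold scheme, as in the book, would need something like Shamir secret sharing instead):

    # Trivial n-of-n XOR split: XOR of ALL shares reconstructs the pad.
    # A real k-of-n threshold scheme would use e.g. Shamir secret sharing.
    import secrets

    def split_pad(pad: bytes, n: int) -> list[bytes]:
        shares = [secrets.token_bytes(len(pad)) for _ in range(n - 1)]
        last = bytearray(pad)
        for share in shares:                 # last share = pad XOR all others
            for i, b in enumerate(share):
                last[i] ^= b
        return shares + [bytes(last)]

    def join_pad(shares: list[bytes]) -> bytes:
        out = bytearray(len(shares[0]))
        for share in shares:
            for i, b in enumerate(share):
                out[i] ^= b
        return bytes(out)

    pad = secrets.token_bytes(16)
    assert join_pad(split_pad(pad, 3)) == pad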


Thank you for elucidating


Does NIST require submissions to be free (as in open software)?

Or is this going to be a bunch of advertising for proprietary patented algorithms?


You can read their Intellectual Property statements here: https://csrc.nist.gov/CSRC/media/Projects/Post-Quantum-Crypt.... They will allow patents, but their goal seems to be a royalty-free implementation.


This is to develop PQC (post-quantum crypto) algorithms driven by the industry. In practice, as long as quantum computers don't both work and become scalable in the interim, you have another four years before you need to worry about the recommendations.


That's not really true though is it? What about any of my data that is encrypted today with a non-PQC algorithm and has been intercepted? If quantum computers become functional and scalable while that data is still of value to me then I need to worry.


Yes, it's true. I spoke about NIST's future PQC recommendations, which are on a scheduled timeline. I never said there was nothing to worry about, but that NIST's future PQC recommendations don't need to be worried about today.

You are making a reference to something other than the future PQC recommendations from NIST. Of course, if the crypto agility is low, requiring a large ramp-up time, a company can turn to the CNSA values today.

As you point out, working scalability is the issue. Even when there is a computer that can run Grover or Shor over something, the number of qubits will be the issue.


Are there any blogs that talk about what's required in a crypto algorithm to withstand quantum computing? Is straightforwardly increasing the key size enough? Or are new paradigms being explored? Are these algorithms symmetric or asymmetric?


PQ crypto has to be secure against a scalable quantum computer, i.e. one that can be scaled up to attack arbitrarily large key sizes. Building a quantum computer with enough qubits to attack RSA-2048 is not a problem if there is no known way to increase the number of qubits to attack RSA-3072 -- but if you can keep adding more qubits, then we need a post-quantum algorithm.

The interest in post-quantum crypto is mainly in public-key systems, because for symmetric-key systems you can essentially double the key size to achieve the same security level. In other words, AES-256 is as secure against a quantum computer as AES-128 is against a classical computer. On the other hand, quantum algorithms break RSA and DLOG (including ECDLOG), so the research has been on new approaches to public-key crypto that seem hard for quantum computers. For example, the Learning With Errors (LWE) problem:

https://en.wikipedia.org/wiki/Learning_with_errors
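As a toy illustration of what an LWE instance looks like (toy parameters only; real schemes choose n, q, and the error distribution carefully):

    # Toy LWE sample: given many pairs (a, b = <a, s> + e mod q), recovering
    # the secret s is believed hard even for quantum computers.
    import numpy as np

    n, q = 8, 97                          # dimension and modulus (toy sizes)
    rng = np.random.default_rng(1)

    s = rng.integers(0, q, size=n)        # secret vector
    a = rng.integers(0, q, size=n)        # public random vector
    e = int(rng.integers(-2, 3))          # small error term

    b = (a @ s + e) % q                   # one LWE sample: (a, b)
    print(a, b)   # without e this is plain linear algebra; with e it's hard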


Interesting. But a 512-bit EC key theoretically has the same strength as a 256-bit AES key. How is AES able to withstand it when EC and RSA can't?


AES's security loss in the quantum scenario comes from a sqrt(n) reduction in search time: Grover's algorithm is essentially an optimized brute force against the key.

ECC gets broken because there is a quantum algorithm that solves the Elliptic Curve Discrete Logarithm Problem much faster than sqrt(n).
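To put rough numbers on the asymmetry (a sketch, not a formal cost model):

    % Grover: generic search over 2^k keys costs O(2^{k/2}) quantum operations,
    % so AES-256 still leaves ~2^{128} work -- what AES-128 gives classically.
    T_{\mathrm{Grover}} = O\big(\sqrt{2^{k}}\big) = O\big(2^{k/2}\big)
    % Shor: factoring / discrete log (incl. EC) in polynomial time,
    % so no feasible key-size increase rescues RSA or ECC.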


Thanks. It looks like I had a wrong understanding of quantum computing. I thought QC increases computing power by a lot, thereby making it easier or practical to compute the discrete log. So are you saying that the QC paradigm lends itself to a different class of algorithms?


Yes, look up Shor's Algorithm and Grover's Algorithm.


This paper lists a few of the practical concerns for quantum-resistant algos (and proposes an algo that wasn't submitted to NIST Post-Quantum Cryptography Round 1):

"Quantum attacks on Bitcoin, and how to protect against them" https://arxiv.org/abs/1710.10377 (~2027?)

A few Quantum Computing and Quantum Algorithm resources: https://news.ycombinator.com/item?id=16052193

Responsive HTML (arxiv-vanity/engrafo, PLoS) or Markdown in a Jupyter notebook (stored in a Git repo with a tag and maybe a DOI from figshare or Zenodo) really would be far more useful than comparing LaTeX equations rendered into PDFs.


This seems to have generated a lot of collaboration.

DJB is listed on 5 separate submissions.

Philippe Gaborit is on 8!


If I were you, I wouldn't be so quick to trust any cryptography standard set by a government agency. They have an obvious incentive to ensure that any standard that is eventually set be open to attack by whatever means the NSA has (or will have in the near future).

So pay careful attention to things they reject, or things they modify!

I have no insider knowledge here. I'm just applying common sense...



I guess we'll leave this one up since that one didn't attract discussion while this one has.


I sure hope we can get a post-quantum version of Dual_EC_DRBG from the trustworthy people at NIST. /s

Not that I especially think any of these algorithms are going to be backdoored (though I certainly hope they get extra scrutiny just in case). I'm mostly just disappointed that there aren't any real consequences for a group like NIST for rubber-stamping backdoored things.



