tptacek's comments | Hacker News

LWE cryptography is probably better understood now than ECDH was in 2005, when Bernstein published Curve25519, but I think you'll have a hard time finding where Bernstein recommended hybrid RSA/ECDH key exchanges.
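
For anyone unfamiliar with the construction at issue: a hybrid key exchange derives the session key from both a classical and a post-quantum shared secret, so an attacker has to break both. A minimal Python sketch of the combiner idea, assuming the Python "cryptography" package for X25519 and using random bytes as a stand-in for the ML-KEM secret (which that package doesn't provide):

    # Sketch only: the session key binds both shared secrets, so breaking
    # ECDH alone (or the PQ KEM alone) doesn't recover the session key.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    ss_classical = alice.exchange(bob.public_key())  # real X25519 shared secret
    ss_pq = os.urandom(32)  # placeholder for an ML-KEM encapsulated secret

    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None,
        info=b"hybrid-kex-sketch",
    ).derive(ss_classical + ss_pq)  # concatenate-then-KDF combiner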

This is extremely not settled science. Education does in fact improve IQ, and we don't know how fixed intelligence is or how it responds to different environmental cues.

To which "underlying controversial question" are you referring?

Worth noting that this concern over as-yet-undiscovered cryptanalytic techniques also applies to Bernstein's preferred SNTRUP.

Yes: because it took forever for curves to percolate into the WebPKI (as opposed to the TLS handshake itself), and by the time they did, (1) we had (especially for TLS) resolved the "safe curves"-style concerns with the P-curves and (2) we were already looking over the horizon to PQ, so there has been little impetus to forklift in a competing curve design.

How would NSA have "placed" a backdoor in Kyber? NSA didn't write Kyber.

He does a motte-and-bailey thing with the P-curves. I don't know if it's intentional or not.

Curve25519 was a materially important engineering advance over the state of the art in P-curve implementations when it was introduced. There was a window of time within which Curve25519 foreclosed on Internet-exploitable vulnerabilities (and probably a somewhat longer period of time where it foreclosed on some embedded vulnerabilities). That window of time has pretty much closed now, but it was real at the time.

But he also does a handwavy thing about how the P-curves could have been backdoored. No practicing cryptography engineer I'm aware of takes these arguments seriously, and to buy them you have to take Bernstein's side over people like Neal Koblitz.

The P-curve backdoor argument is unserious, but the P-curve implementation stuff has enough of a solid kernel to it that he can keep both arguments alive.


Quite true, but the Dual_EC backdoor claim is serious. DJB's point that we should design curves with "nothing up my sleeve" is a nice touch.

See, this gets you into trouble, because Bernstein actually has a pretty batshit take on nothing-up-my-sleeve constructions (see the BADA55 paper) --- and that argument also hurts his position on Kyber, which does NUMS stuff!

Link?


There’s also a more approachable set of slides on the topic at https://cr.yp.to/talks/2025.11.14/slides-djb-20251114-safecu...

I tried a couple searches and I forget which calculator-speak spelling of "BADASS" Bernstein actually used, but the concept of the paper† is that all the NUMS-style curves are suspect, because you can make combinations of mathematical constants say whatever you want them to say, and so instead you should pick curve constants based purely on engineering excellence, which nobody could ever disagree about or (looks around the room) start huge conspiracy theories over.

as I remember it
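
To make the paper's argument concrete, here is a toy illustration (mine, not the paper's): even a rigid-looking "derive the constant from a famous number" recipe leaves the designer free choices --- which constant, which hash, which counter --- and an attacker with a secret weakness criterion just searches across all of them.

    # Toy BADA55-style search: each (constant, hash, counter) triple looks
    # "nothing up my sleeve" in isolation, but together they give a curve
    # designer thousands of plausible candidate constants to pick among.
    import hashlib

    constants = ["pi", "e", "sqrt2", "phi", "ln2", "gamma"]
    hash_names = ["sha256", "sha384", "sha512"]
    candidates = set()
    for c in constants:
        for h in hash_names:
            for counter in range(500):  # "first counter yielding a valid curve"
                digest = hashlib.new(h, f"{c}/{counter}".encode()).hexdigest()
                candidates.add(digest[:16])  # toy stand-in for a curve constant
    print(len(candidates), "distinct natural-looking constants")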


I tried to defend a similar argument in a private forum today and basically got my ass handed to me. In practice, not only would modern P-curve implementations not be "significantly weaker" than Curve25519 (we've had good complete addition formulas for them for a long time, along with widespread hardware support), but Curve25519 causes as many problems as it solves, probably more --- cofactor problems being more common in modern practice than point validation mistakes.
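
For concreteness, here is a sketch of the two failure modes being weighed (illustrative Python, not any library's real API; the P-256 constants are from FIPS 186, and the all-zeros check is RFC 7748's guidance):

    # P-curve side: the classic mistake is skipping point validation,
    # which enables invalid-curve attacks.
    P = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
    A = P - 3
    B = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b

    def p256_validate(x: int, y: int) -> bool:
        """Reject points that are not on the curve."""
        if not (0 <= x < P and 0 <= y < P):
            return False
        return (y * y - (x * x * x + A * x + B)) % P == 0

    Gx = 0x6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296
    Gy = 0x4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5
    assert p256_validate(Gx, Gy) and not p256_validate(Gx, Gy + 1)

    # Curve25519 side: cofactor 8 means low-order points exist, so RFC 7748
    # tells you to reject an all-zero shared secret instead.
    def x25519_output_check(shared_secret: bytes) -> bytes:
        if shared_secret == bytes(32):  # peer supplied a low-order point
            raise ValueError("low-order point; contributory behavior violated")
        return shared_secret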

In TLS, Curve25519 vs. the P-curves is a total non-issue, because TLS isn't generally deployed anymore in ways that even admit point validation vulnerabilities (even if implementations still had them). That bit I already knew, but I'd assumed ad-hoc non-TLS implementations, by random people who don't know what point validation is, might tip the scales. Turns out: I guess not.

Again, by way of bona fides: I woke up this morning in your camp, regarding Curve25519. But that won't be the camp I go to bed in.


I agree that Curve25519 and other "safer" algorithms are far from immune to side channel attacks in their implementations. For example, [1] is a single-trace EM side channel key recovery attack against Curve25519 as implemented in MbedTLS on an ARM Cortex-M4. That implementation had the benefit of a constant-time Montgomery ladder, something NIST P-curve implementations have traditionally lacked an analogue of, but it nonetheless failed because a conditional swap instruction leaked secret state via EM.
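
For concreteness, the kind of conditional swap at issue looks like the generic sketch below (mine, not MbedTLS's actual ARM code): it is branch-free and "constant-time" by construction, and yet, as [1] shows, the mask-dependent data flow can still leak through power or EM, which is why [1] had to propose an additional countermeasure.

    # A generic masked conditional swap: no branches, identical instruction
    # stream either way. Timing-safe by construction, but the data being
    # moved still depends on the secret bit, which is what EM traces see.
    def ct_cswap(swap: int, a: int, b: int) -> tuple[int, int]:
        mask = -(swap & 1)   # swap=0 -> 0; swap=1 -> -1 (all ones)
        t = mask & (a ^ b)   # 0, or a ^ b, selected without branching
        return a ^ t, b ^ t

    assert ct_cswap(0, 5, 9) == (5, 9)
    assert ct_cswap(1, 5, 9) == (9, 5)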

The general question is: could a standard written in 2025 build on decades of research and implementation failures to specify side-channel-resistant algorithms, addressing conditional jumps, processor optimisations for math functions, and other behaviours that might leak secret state via timing, power, or EM signals? See, for example, section VI of [1], which proposed a new side channel countermeasure that ended up being implemented in MbedTLS to mitigate the conditional swap leak. Could such countermeasures be added to the standard in the first instance, rather than left to implementers to figure out from their own reading of IACR papers?

One could argue that standards simply follow the interests of standards proposers and organisations, who might not care about cryptography implementations on smart cards, TPMs, etc., or about side channel attacks between different containers on the same host. Perhaps standards proposers and organisations only care about side channel resistance across remote networks with high noise floors for timing signals, where attacks such as [2] (a 300ns timing signal) are not considered feasible. If so, I would argue that standards should still state their security model more clearly, for example:

* Is the standard assuming the implementation has a noise floor of 300ns for timing signals, 1ms, etc? Are there any particular cryptographic primitives that implementers must use to avoid particular types of side channel attack (particularly timing)?

* Implementation fingerprinting resistance/avoidance: how many choices can an implementation make that may allow a cryptosystem party to be deanonymised by the specific version of a crypto library in use?[3] Does the standard provide any guarantee for fingerprinting resistance/avoidance?

[1] Template Attacks against ECC: practical implementation against Curve25519, https://cea.hal.science/cea-03157323/document

[2] CVE-2024-13176: Timing side-channel in ECDSA signature computation (OpenSSL), https://openssl-library.org/news/vulnerabilities/index.html#...

[3] Table 2, pyecsca: Reverse engineering black-box elliptic curve cryptography via side-channel analysis, https://tches.iacr.org/index.php/TCHES/article/view/11796/11...


It's literally the ethos of the IETF going back to (at least) the late 1980s, when this was the primary contrast between the IETF standards process and the more staid and rigorous OSI process. It's not usefully up for debate.

NSA wrote Dual EC. A team of (mostly European) academic cryptographers wrote the CRYSTALS constructions. Moreover, the NOBUS ("nobody but us") mechanism in Dual EC is obvious, and it's not at all clear where you'd do anything like that in Kyber, which goes out of its way not to have the "weird constants" problem that the P-curves (which practitioners generally trust) ended up with.

It took a couple of years to get the suspicion about Dual_EC out.

No it didn't. The problem with Dual EC was published in a rump session at the next CRYPTO after NIST published it. The widespread assumption was that nobody was actually using it, an assumption enabled by the fact that the important "target" implementations were deeply closed-source --- most importantly RSA BSAFE, which I think a lot of people also assumed wasn't in common use (though I may just be saying that because it's what I myself assumed).

None of this applies to anything else besides Dual EC.

That aside: I don't know what this has to do with anything I just wrote. Did you mean to respond to some other comment?

