> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.
Is that even a claim here? I'm on mobile right now so it's a bit hard for me to trawl through the DJB/NIST dialogue, but I thought his main complaint is that NIST didn't appear to have a proper and clear process for choosing the algorithms they did, when arguably better algorithms were available.
So the suggestion wouldn't necessarily be that one of the respected contestants was bribed or otherwise compromised, but rather that NIST may have been tapped on the shoulder by NSA (again) with the suggestion that they pick a specific algorithm, and that NSA would make that suggestion because their own cryptographers ("true believers" on NSA payroll) have discovered flaws in the suggested algorithms that they believe NSA can exploit but that adversaries hopefully cannot.
There's no need for any novel conspiracies or corruption; merely an exact repeat of previous NSA/NIST behaviour consistent with NSA policy positions.
It's simultaneously about as banal as it gets, and deeply troubling because of that.
I guess I'm not reading it that way. In fact, a FOIA request is going after official records, which I wouldn't expect would contain outright bribery.
Yes, DJB brings up their known bribing of RSA wrt the whole Dual-EC thing. But my read of that bit of info was the more general 'here's evidence that the NSA actively commits funding towards infecting standards' rather than 'the NSA's playbook just contains outright bribery, and that's what we expect to find in the records NIST turns over in response to the FOIA requests'.
Clearly you don’t get it. They’re playing dirty. At best the FOIA request will yield a document made on the fly with nothing of value. The rules don’t apply to the NSA. You can do exactly nothing about them. But NIST you can do something about: reject any standard they approve. It’s your choice what algorithm you use, and we know NIST will select a broken algorithm for the NSA, so just ignore their ‘standard’.
The best solution is using layers of crypto, trusting no single algorithm.
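To make that concrete, here's a minimal sketch of layering (my own illustration, not anything from the article, assuming Python's `cryptography` package; key management is omitted): the plaintext is wrapped under two independently keyed AEADs, so an attacker has to break both primitives, or steal both keys, to recover it.

```python
# Sketch: layered ("cascade") encryption with two independent algorithms.
# Assumes the 'cryptography' package; key handling is simplified for brevity.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

def layered_encrypt(plaintext: bytes) -> dict:
    k1, k2 = os.urandom(32), os.urandom(32)   # independent keys
    n1, n2 = os.urandom(12), os.urandom(12)   # independent nonces
    inner = AESGCM(k1).encrypt(n1, plaintext, None)         # layer 1: AES-256-GCM
    outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)   # layer 2: ChaCha20-Poly1305
    return {"ciphertext": outer, "keys": (k1, k2), "nonces": (n1, n2)}

def layered_decrypt(blob: dict) -> bytes:
    k1, k2 = blob["keys"]
    n1, n2 = blob["nonces"]
    inner = ChaCha20Poly1305(k2).decrypt(n2, blob["ciphertext"], None)
    return AESGCM(k1).decrypt(n1, inner, None)

# A break of either AES-GCM or ChaCha20-Poly1305 alone does not expose the plaintext.
```

The same idea applies to key exchange: combine the outputs of independent schemes so no single algorithm is a single point of failure.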
"You shouldn't fight because the baddies are strong!" is a horrible argument in my book. Discouraging and disparaging other people's attempts is even worse.
The actual claim is that NSA may have already spent a lot of time and effort analysing the problems underlying the PQC algorithms without making their findings public.
DJB seems to suspect that they may influence NIST to select algorithms and parameters within the range of what they already know how to break.
Huh? Of course NSA spent a lot of time and effort analyzing algorithms without making their findings public. That is their literal job. The peer review NIST is refereeing happened in the open. When people broke SIDH, they didn't whisper it in anyone's ear: they published a paper. That's how this stuff works. Bernstein doesn't have a paper to show you; all he has is innuendo. How you know his argument is as limp as a cooked spaghetti noodle is that he actually stoops to suggesting that NSA might have bribed one of the members of the PQC teams.
If he had something real to say, he wouldn't have embarrassed himself like that. How I think I know that is, I think any reasonable person would go way out of their way to avoid such an embarrassing claim, absent extraordinary evidence, of which he's presented none.
It is very hard to not take this comment as being made in bad faith. You either are willfully ignorant or have ulterior motives.
It is a matter of public record (as also detailed again in the article) that the NSA colluded with NIST to get weakened crypto standardised. This happened not once but multiple times, and when weaknesses became known they repeatedly (and against better knowledge) downplayed the impact. This is undisputed. After the Dual EC scandal they promised that they would be more transparent in the future. DJB alleges that important information is missing about the decision-making processes in the most recent PQC discussion (I am willing to trust him on that, but if you are an expert in the field I'm sure you can elaborate here on why it is incorrect). That's why he filed a FOIA request, which has not been answered and over which he is now filing a lawsuit.
I would argue that, based on past behaviour, we should trust DJB much more than either the NSA or NIST, but it seems you are more occupied with making unsubstantiated attacks on his person than with getting to the truth.
> he actually stoops to suggesting that NSA might have bribed one of the members of the PQC teams
I don't know anyone on the teams well enough to judge their moral fiber, but I'm 100% sure the NSA is not above what is suggested, and your weird outrage at the suggestion is surprising given what is public knowledge about how the NSA operates.
There are arguments here about NSA pressure on NIST. You miss the point because apparently you're offended that someone suggested your friends can be bribed. I mean, maybe they can't, but this is about the NSA being corrupt, not the researchers.
It can be everybody involved. It should include NIST based on the history alone.
Some of the commentary on this topic is by people who also denied DUAL_EC until (correctly) conceding that it was actually a backdoor, actually deployed, and that it is embarrassing for both NSA and NIST.
This sometimes looks like reactionary denialism. It’s a safe position that forces others to do a lot of work; it seems to be in good faith with some people and not so much with others.
I'm one of the people who denied that Dual EC was a backdoor (my position wasn't an unusual one; it was that Dual EC was too stupid to actually use, which made it an unlikely backdoor). Dan Bernstein didn't educate me about that; like anybody else who held that position, the moment I learned that real products in the industry were built with libraries that defaulted to Dual EC, the jig was up.
I'm honest about what I'm saying and what I've said. You are not meeting the same bar. For instance, here you're insinuating that my problem on this thread is that I think NIST is good, or trustworthy, or that NSA would never have the audacity to try to bribe anybody. Of course, none of that is true.
I don't know how seriously you expect anybody to take you. You wrote a 13-paragraph comment on this thread based on Filippo's use of an "It's Always Sunny In Philadelphia" meme, saying that it was a parody of "A Beautiful Mind", which is about John Nash, who was mentally ill and also an anti-semite, ergo Filippo Valsorda is an anti-semite who punches down at the mentally ill. It's right there for everybody to read.
How could any serious security researcher have been in doubt about Dual EC? The design did not make any sense at all. Not until you consider that it was designed with a back door; then it is a sleek, minimal design that does exactly what it needs to do and not a whole lot more.
If you couldn't see that from a mile away, then you might be too naive to work in security.
> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.
Could you elaborate on this? I didn't get this from the article at all. There's no researcher(s) being implicated as far as I can tell.
What I read is the accusation of NIST's decision-making process possibly being influenced by the NSA, something that we know has happened before.
Say N teams of stellar researchers submit proposals, and they review their peers. For the sake of argument, let's say that no flaw is found in any proposal; every single one is considered perfect.
NIST then picks algorithm X.
It is critical to understand the decision making process behind the picking of X, crucially so when the decision-making body has a history of collusion.
Because even if all N proposals are considered perfect by all possible researchers, if the NSA did influence NIST in the process, history would suggest that X would be the least trustworthy of all proposals.
And that's the main argument I got from the article.
Yes, stone-walling a FOIA request may be common, but in the case of NIST, there is ample precedent for malfeasance.
I don't even support NIST's mission; even if you assembled a trustworthy NIST, I would oppose it.
The logical problem with the argument Bernstein makes about NSA picking the least trustworthy scheme is that it applies to literally any scheme NIST picks. It's unfalsifiable. If he believes it, his FOIA effort is a waste of time (he cannot FOIA NSA's secret PQC attack knowledge).
The funny thing here is, I actually do accept his logic, perhaps even more than he does. I don't think there's any reason to place more trust in NIST's PQC selections than other well-reviewed competing proposals. I trust the peer review of the competitors, but not NIST's process at all.
> The logical problem with the argument Bernstein makes about NSA picking the least trustworthy scheme is that it applies to literally any scheme NIST picks. It's unfalsifiable.
That may be true in the strict sense, but in practice, I think there would be a material distinction between a NIST process of "we defer our decision to the majority opinion of a set of three researchers with unimpeachable reputations" (a characterization from another comment) and a process of "NSA said we should pick X."
In the strict sense, I can't trust either process, but in practice [edit: as an absolute layperson who has to trust someone], I'd trust the first process infinitely more (as I would absolutely distrust the second process).
> The funny thing here is, I actually do accept his logic, perhaps even more than he does.
That's actually what I got from your other comments to this story. But that confused me, because it was also what I got from the article. The first two thirds of the article are spent entirely on presenting NIST as an untrustworthy body based on decades of history. Apart from the title, PQC isn't even mentioned until the last third, and that part, to me, was basically "NIST's claims of reform are invalidated if it turns out that NSA influenced the decision-making process again".
My vibe was that both of your positions are more or less in agreement, though I have to say I didn't pick up on any accusations of corruption of a PQC researcher in the article (I attribute that to me being a layperson in the matter).
I believe you have a very naive and trusting view of these US governmental bodies. I don't intend that as an insult, but by now I think the verdict is in: these agencies cannot be trusted (the NSA even less than NIST).
I'm not sure about corrupting NIST nor corrupting individual officials of NIST, but I can easily imagine NIST committees not understanding something, being tricked, not looking closely, protecting big orgs by default (without maliciousness), and overall being sloppy.
Running standards processes without full transparency, in my experience with web security and WebGPU standards, is almost always about hiding weaknesses, incompetence, and security gaps of big players, and the internal politics of those powerful incumbents. Think of some hardware vendor not playing ball without a guarantee of privacy, or some Google/Apple committee member dragging their feet because of internal politics and monopoly plays. Separately, mistakes may come from a standards committee member glossing over stuff in emails because they're busy: senior folks are the most technically qualified yet also the busiest. Generally it's not because some NSA/CIA employee is telling them to do something sneaky or lying. Still FOIA-worthy (and why I prefer public lists for standards), but for much lamer reasons.
> ...but I can easily imagine NIST committees not understanding something, being tricked, not looking closely, protecting big orgs by default (without maliciousness), and overall being sloppy.
I agree with this. And I think that this is more likely to be the case. But I really think with all that we now know about US governmental organisations the possibility of backdoors or coercion should not be ruled out.
Even when you're trying to be charitable, you're wildly missing the point. I don't give a fuck about NIST or NSA. I don't trust either of them and I don't even buy into the premise of what NIST is supposed to be doing: I think formal cryptographic standards are a force for evil. The point isn't that NIST is trustworthy. The point is that the PQC finalist teams are comprised of academic cryptographers from around the world with unimpeachable reputations, and it's ludicrous to suggest that NSA could have compromised them.
The whole point of the competition structure is that you don't simply have to trust NIST; the competitors (and cryptographers who aren't even entrants in the contest) are peer reviewing each other, and NIST is refereeing.
What Bernstein is counting on here is that his cheering section doesn't know the names of any cryptographers besides "djb", Bruce Schneier, and maybe, just maybe, Joan Daemen. If they knew anything about who the PQC team members were, they'd shoot milk out their nose at the suggestion that NSA had suborned backdoors from them. What's upsetting is that he knows this, and he knows you don't know this, and he's exploiting that.
My reading wasn't that he thinks they built backdoors into them, but that the NSA might be aware of weaknesses in some of them, and be trying to promote the algorithms they know how to break.
Peer review and "informal standards". Good examples of things that were, until long after their widespread adoption, informal standards include Curve25519, Salsa20 and ChaCha20, and Poly1305. A great example of an informal standard that remains an informal standard despite near-universal adoption is WireGuard. More things like WireGuard. Less things like X.509.
Both formal and informal peer review are why I like the FOIA, and standards / competition discussion to be open in general. I actually dislike closed peer review, or at least without some sort of time-gated release.
Likely scenarios that closed review hides:
- Peer review happened... But was lame. Surprisingly common, and often the typical case.
- If some discussion did come up on a likely attack... What? Was the rebuttal and final discussion satisfactory?
It's interesting if some gov team found additional things... But I'm less worried about that, they're effectively just an 'extra' review committee. Though as djb fears, a no-no if they ask to weaken something... And hence another reason it's good for the history of the alg to be public.
Edit: Now that storage and video are cheap, I can easily imagine a shift to requiring all emails + meetings to be fully published.
Edit: I can't reply for some reason, but having been an academic reviewer, including for security, and having won awards for best paper of the year/decade, I can say academic peer review may not be doing what most people think. E.g., it is often more about novelty, trends, and increments, judged from a one-hour skim, or about catching only the super obvious things that outsiders and fresh researchers mess up on. Very different from, say, a yearlong $1M dedicated pentest, which I doubt happened. It's easy to tell which kind of review happened when reading a report... Hence me liking a call for openness here.
You get that the most important "peer review" in the PQC contest took the form of published academic research, right? NIST doesn't even have the technical capability to do the work we're talking about. My understanding is that they refereed; they weren't the peer reviewers.
Replying to your edit: I've been an academic peer reviewer too. For all of its weaknesses, that kind of peer review is the premise of the PQC contest --- indeed, it's the premise of pretty much all of modern cryptography.
As much as I like the design of WireGuard, the original paper made stronger claims of security than were achieved with respect to key exchange models. Peer review and informal standards failed in catching this. From my perspective, the true benefit of a formal standardisation process such as this is that it dangles such a publishable target in front of researchers that we formally verify/disprove these claims out in the open.
WireGuard's design is superior to that of its competitors, and one of its distinctive features is that it lacks formal standardization. It's not as if we don't have decades of experiences with attempts to standardize our way into strong cryptography; see IPSEC for a particularly notorious example of how badly standards processes handle this stuff.
For sure, if a standardization process had been called to design a VPN protocol, I'd agree that the resulting design would almost certainly be worse than WireGuard. I think that the competitive nature of the PQC process, as well as soliciting completed submissions rather than running a process to build something from the ground up, helps in this regard. I don't think that engages with the point I was making, however: the original submission of WireGuard made claims that were incorrect, which would arguably have been caught sooner if it were part of a formal standardization process, since researchers would have been incentivized to analyse it sooner.
Having come from a community that is often cleanup duty for unfounded claims (PL) and having to spend ~decade+ $100M+ efforts to do so... I didn't realize that about wireguard. That's pretty strange to read in 2022.
To be clear, WireGuard is a good VPN protocol, and definitely a secure design. I wouldn't recommend another over it. It's just that the initial claims of security in the NDSS paper were incompatible with its design.
I'm sure it's a pretty good one, but it's quite hard to trust more than that both on the design + impl side if you ever have tried to verify (vs just test) such a system. Think the years of pain for something much more trivial like paxos + an impl of it.
In this case, looks like the community does value backing up its claims, and the protocol is verified: https://www.wireguard.com/formal-verification/ . Pretty awesome! The implementation itself seems to be written unsafely, so TBD there.
You're probably right about my original comment, and I apologize. These threads are full of very impassioned, very poorly-informed comments --- I'm not saying I'm well-informed about NIST PQC, because I'm not, but, I mean, just, wow --- and in circumstances like that I tend to play my cards very close to my chest; it's just a deeply ingrained message board habit of mine. I can see how it'd be annoying.
I spent almost 2 decades as a Daniel Bernstein ultra-fan --- he's a hometown hero, and also someone whose work was extremely important to me professionally in the 1990s, and, to me at least, he has always been kind and cheerful; he even tried to give us some ideas for ECC challenges for Cryptopals. I know what it's like to be in the situation of (a) deeply admiring Bernstein and (b) only really paying attention to one cryptographer in the world (Bernstein).
But talk to a bunch of other cryptographers --- and, also, learn about the work a lot of other cryptographers are doing --- and you're going to hear stories. I'm not going to say Bernstein has a bad reputation; for one thing, I'm not qualified to say that, and for another I don't think "bad" is the right word. So I'll put it this way: Bernstein has a fucked up reputation in his field. I am not at all happy to say that, but it's true.
Based only on random conversations and no serious interrogation of what happened, so take it for the very little this pure statement of opinion is worth, I'd say he has, chiefly, and in my own words, a reputation for being a prickly drama queen.
He has never been that to me; I've had just a few personal interactions with him, and they've been uniformly positive. My feeling is that he was generous with his time and expertise when I had questions, and pleasant and welcoming in person.
He has, in the intervening years, done several things that grossed me the fuck out, though. There are certainly people who revel in hating the guy. I'm not one of them.
> If they knew anything about who the PQC team members were, they'd shoot milk out their nose at the suggestion that NSA had suborned backdoors from them.
> the motivation behind those requests is risible.
It is quite hilarious that NIST suckered the industry into actually using Dual-EC, despite being worse than the other possible choices in nearly every respect. And this ignores the fact that the backdoor was publicly known for years. This actually happened; it’s not a joke.
The motivation behind the FOIA requests is to attempt to see whether any funny business is going on with PQ crypto.
If the NSA actually suckers any major commercial player into using a broken PQ scheme without a well-established classical scheme as a backup, that will be risible too.
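For what it's worth, the usual way to keep a classical scheme as a backup is a hybrid key exchange: derive the session key from both a classical shared secret and the PQ shared secret, so the result stays safe if either one holds. A rough sketch (my own, assuming Python's `cryptography` package for X25519 and HKDF; the PQ KEM secret below is a labeled stand-in, since I'm not showing a real PQ library call):

```python
# Sketch of hybrid key derivation: classical X25519 combined with a PQ KEM secret.
# Uses the 'cryptography' package; 'pq_shared_secret' stands in for the output of a
# real PQ KEM encapsulate/decapsulate pair (hypothetical here, not shown).
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Classical ECDH share.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
classical_secret = alice.exchange(bob.public_key())

# Placeholder for the PQ KEM shared secret.
pq_shared_secret = os.urandom(32)

# Bind both secrets together: the session key only falls if BOTH inputs are broken.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-kex-demo",
).derive(classical_secret + pq_shared_secret)
```

With that construction, a broken PQ scheme degrades you to classical security rather than to nothing, which is exactly the "backup" being described.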
Dual_EC keeps getting brought up, but I have to ask: does anybody have any real evidence that it was widely deployed? My recollection is that it basically didn't appear anywhere outside of a handful of not-widely-used FIPS-certified libraries, and wasn't even the default in any of them except RSA's BSAFE.
The closest thing we have to evidence that Dual_EC was exploited in the wild seems to be a bunch of circumstantial evidence around its role in the OPM hack which, if true, is much more of a "self own" than anything else.
It was widely deployed. NSA got it into BSAFE, which I would have said "nobody uses BSAFE, it's not 1996 anymore", but it turned out a bunch of closed-source old-school hardware products were using BSAFE. The most notable BSAFE victims were Juniper/Netscreen.
Everybody who claimed Dual EC was a backdoor was right, and that backdoor was materially relevant to our industry. I couldn't believe something as dumb as Dual EC was a real backdoor; it seemed like such idiotic tradecraft. But the belief that Dual EC was so bad as tradecraft that it couldn't be real was, apparently, part of the tradecraft! Bernstein is right about that (even if he came to the conclusion at basically the same time as everyone else --- like, the instant you find out Juniper/Netscreen is using Dual EC, the jig is up).
I don't think Juniper used BSAFE in ScreenOS -- they seem to have put together their own Dual EC implementation on top of OpenSSL, sometime around 2008. (This doesn't change your point, of course.)
Yeah, I think you're right; the Juniper revelation also happened months after the BULLRUN stuff --- I remember being upset about how Greenwald and his crew had hidden all the Snowden docs in a SCIF to "carefully review them", with the net result that we went many months without knowing that one of the most popular VPN appliances was backdoored.
ECDSA is almost universally used. It's deeply suboptimal in a variety of ways. But that's because it was designed in the 1990s, not because it's backdoored. This isn't a new line of argumentation for Bernstein; he has also implied that AES is Rijndael specifically because it was so commonly implemented with secret-dependent lookups (S-boxes, in the parlance); he's counting on a lay audience not knowing the distinction between an engineering principle mostly unknown at the time something was designed, and a literal backdoor.
What's annoying is that he's usually right, and sometimes even right in important new ways. But he runs the ball way past the end zone. Almost everybody in the field agrees with the core things he's saying, but almost nobody wants to get on board with his wild-eyed theories of how the suboptimal status quo is actually a product of the Lizard People.
Is he claiming that it is a literal backdoor, though? Couldn't Bernstein have a point that NIST picked Rijndael as the winner of the AES competition because the way it was usually implemented was susceptible to timing attacks? Even if the engineering principle was mostly unknown at the time, one might guess that, e.g., the NSA was aware of it and may have provided some helpful feedback.
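To illustrate the distinction being argued over: the classic problem with table-based AES implementations is a secret-indexed lookup, which can leak through cache timing even though the algorithm itself has no backdoor. A toy sketch of the access pattern (my own illustration, not real AES, and in Python only for readability; real constant-time code lives in C or assembly):

```python
# Toy illustration: secret-dependent table lookup vs. a constant-time select.
SBOX = list(range(256))  # stand-in for an AES S-box table

def lookup_leaky(secret_byte: int) -> int:
    # Which memory location (cache line) gets touched depends on the secret value,
    # which is what cache-timing attacks on table-based AES exploit.
    return SBOX[secret_byte]

def lookup_constant_time(secret_byte: int) -> int:
    # Touch every entry and mask out all but the wanted one; the access pattern
    # no longer depends on the secret, at the cost of speed.
    result = 0
    for i, v in enumerate(SBOX):
        mask = -(1 if i == secret_byte else 0) & 0xFF  # 0xFF on match, else 0
        result |= v & mask
    return result

assert lookup_leaky(0x42) == lookup_constant_time(0x42)
```

The question in this subthread is whether preferring a design whose natural implementation is the leaky pattern counts as sabotage or just as 1990s-era engineering.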
> he's counting on a lay audience not knowing the distinction between an engineering principle mostly unknown at the time something was designed, and a literal backdoor.
When you discount his theories with that argument, your own reductio ad Lizardum (?) doesn’t help. There’s a world of distinction between NSA inserting backdoors, for which there’s good evidence but maybe not every time, and whatever you’re trying to paint his theory as by invoking the Lizard People.
You haven't explained how my argument discounts his theories. You're just unhappy that I used the term "Lizard People". Ok: I retract "Lizard People". Where does that leave your argument?
I don't care about his theories. What matters is that US export controls on encryption were reduced thanks to his previous lawsuit, and that he has offered alternative encryption in the public domain.
> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.
maybe you don't know what risible means, but it reads like you're saying that the NSA "somehow" coercing someone is unlikely, which i'm sure you can agree is a "very naive and trusting view"
Nowhere does the comment say that the NSA "somehow" coercing someone is unlikely. Hence, it's a fair question whether the comment had been comprehended, because it seems it hasn't in this thread. If comprehension begets intelligence, then conclusions born from misunderstanding exude stupidity.
And, dropping the pedantry, it's quite frustrating to be deliberately or casually or in whatever way misrepresented by drive-by commenters in an otherwise apt discussion thread. Your comment and the one tptacek responded to are patronizing and dismissive and really don't contribute to any interesting discourse on the topic. I think it's fair to dismiss stupid drive-by low-effort quips, personally.
You used obscure language to make yourself look smart and deal with the resulting confusion by calling people stupid instead of clarifying what was said. Please get your ego in order.
The person is saying one thing then denying saying that thing and being a jerk about it. Either a bot or someone with a broken thesaurus. Glad you pointed it out because it’s ridiculous/risible.
That person is very well known in this community, and in other communities as well.
They are also known for making very specific arguments that people misinterpret and fight over, but the actual intent and literal meaning of the statements is most often correct (IMO).
Whether this is a byproduct of trying to be exacting in the language used, which tends to cause people interpretive problems, or a specific tactic to expose those who are careless with their reading and willing to make assumptions rather than ask questions, is unknown to me, but that doesn't change how it tends to play out, from my perspective.
In this case, I'll throw you a bone and restate his position as I understand it.
NIST ran the competition in question in a way such that the competitors reviewed each other's work, and all of them are very well known in the cryptographic field. The suggestion that they could be bribed in this manner (note: not that the NSA would not attempt it, but the implication that it would succeed with the people in question) is extremely unlikely, and that DJB would suggest as much, knowing his fame may matter to people more than the facts of who these people are, is problematic.
I'm not sure I'd use the same words, but yeah, the argument I'm refusing to dignify is that NSA could have been successful at bribing a member of one of the PQC teams. Like, what is that bribed person going to do? Look at the teams; they're ridiculously big. It doesn't even make sense. Again: part of my dismissiveness comes from how clear it is that Bernstein is counting on his cheering section not knowing any of this, even though it's a couple of Google searches away.
One trivial example implied by the blog post: Such corruption could be involved in the non-transparent decision making process at NIST.
Regarding Dual_EC: we still lack a lot of information about how this decision was made internally at NIST. That’s a core point: transparency was promised in the wake of discovered sabotage and it hasn’t arrived.
What do you mean, "how" the decision about Dual EC was made? It's an NSA-designed backdoor. NIST standardized it because NSA told them to. I'm sure NSA told NIST a story about why it was important to standardize it. The Kremlinology isn't interesting: it is NSA's chartered job to break cryptography, and nobody should ever trust them; the only thing NSA can do to improve cryptography is to literally publish secret attacks, and they're not going to do that.
What do I mean? Iran-Contra, Watergate, or a 9/11 report style report, like levels of investigation. Given how widely read the BULLRUN stories were, it’s not credible to suggest the details aren’t important.
The American people deserve to know who picked up the phone or held a meeting to make this happen. Who was present, who at NIST knew what, and so on. Who internally had objections and indeed who set the policy in the first place. What whistleblower protections were in place and why didn’t the IG have involvement in public? Why did we have to learn about this from Snowden?
NSA has a dual mandate, on that I hope we can agree. It’s my understanding that part of their job is to secure things and that part of their job is to break stuff.
NIST has no such dual mandate, heads should roll at NIST. We probably agree that NSA probably won’t be accountable in any meaningful sense, but NIST must be - we are stuck with them. Not trusting them isn’t an option for anyone who files their taxes or banks or does any number of other regulated activities that require using NIST standards.
If that is the case, then what is the explanation for NIST (according to DJB) 1. not communicating their decision process to anywhere near the degree that they vowed to, and 2. stone-walling a FOIA request on the matter?
> Whether this is a byproduct of trying to be exacting in the language used that tends to cause people interpretive problems or a specific tactic to expose those that are a combination of careless with their reading and willing to make assumptions rather than ask questions is unknown to me
Communicating badly and then acting smug when misunderstood is not cleverness (https://xkcd.com/169/).
If many people do not understand the argument being made, it doesn't matter how "exacting" the language is - the writer failed at communicating. I don't have a stake in this, but from afar this thread looks like tptacek making statements so terse as to be vague, and then going "Gotcha! That's not the right interpretation!" when somebody attempts to find some meaning in them.
In short: If standard advice is "you should ask questions to understand my point", you're doing it wrong. This isn't "HN gathers to tease wisdom out of tptacek" - it's on him to be understood by the readers (almost all of which are lurkers!). Unless he doesn't care about that, but only about shouting (what he thinks are) logically consistent statements into the void.
The explanation for the FOIA process is that public bodies routinely get intransigent about FOIA requests and violate the statutes. Read upthread: I have worked with Bernstein's FOIA attorneys before. Like everyone else, I support the suit, even as I think it's deeply silly for Bernstein to equate it to Bernstein v US.
If you made me guess about why NIST denied his FOIA requests, I'd say that Bernstein probably royally pissed everyone at NIST off before he made those requests, and they denied them because they decided the requests were being made in bad faith.
But they don't get to do that, so they're going to be forced to give up the documents. I'm sure when that happens Bernstein will paint it as an enormous legal victory, but the fact is that these outcomes are absolutely routine.
When we were FOIA'ing the Police General Orders for all the suburbs of Chicago, my own municipality declined to release theirs. I'd already been working with Topic on a (much more important) FOIA case from a friend of mine, so I reached out asking for him to write a nastygram for me. The nastygram cost me money --- but he told me having him sue would not! It was literally cheaper for me to have him sue my town than to have him write a letter, because FOIA suits have fee recovery terms.
I really can't emphasize enough how much suing a public body to force compliance with FOIA is just a normal part of the process. It sucks! But it's utterly routine.
> If that is the case, then what is the explanation for NIST (according to DJB) 1. not communicating their decision process to anywhere near the degree that they vowed to, and 2. stone-walling a FOIA request on the matter?
Why are you asking me, when I was clear I was just stating my interpretation of his position, and he had already replied to me with even more clarification to his position?
> Communicating badly and then acting smug when misunderstood is not cleverness
I don't disagree. My observations should not be taken as endorsement for a specific type of behavior, if that's indeed what is being done.
That said, while I may dislike how the conversation plays out, I can't ignore that very often he has an intricate and well-thought-out position that is expressed succinctly, and in the few cases where someone treats the conversation with respect and asks clarifying questions rather than making assumptions, the conversation is clear and understanding is quickly reached between most parties.
I'm hesitant to lay the blame all on one side when the other side is the one jumping to conclusions and then refusing to accept their mistake when it's pointed out.
At the risk of belaboring the obvious: An attacker won't have to say "Oops, researcher X is working in public and has just found an attack; can we suppress this somehow?" if the attacker had the common sense to hire X years earlier, meaning that X isn't working in public. People arguing that there can't be sabotage because submission teams can't be bribed are completely missing the point.
He goes on to say:
I coined the phrase "post-quantum cryptography" in 2003. It's not hard to imagine that the NSA/IDA post-quantum attack team was already hard at work before that, that they're years ahead of the public in finding attacks, and that NSA has been pushing NISTPQC to select algorithms that NSA secretly knows how to break.
Does this seem unreasonable, and if so, why?
He also remarks:
Could such a weakness also be exploited by other large-scale attackers? Best bet is that the answer is yes. Would this possibility stop NSA from pushing for the weakness? Of course not.
Doesn’t sound to me like he only has concerns about bribery. Corruption of the standards to NSA’s benefit is one overarching issue. It’s not the only one, he has concerns about non-American capabilities as well.
There are many methods for the NSA to achieve a win.
Ridiculing people for worrying about this is totally lame and is harmful to the community.
To suggest a few dozen humans are beyond reproach from attack by the most powerful adversaries to ever exist is extremely naive at best. However that literally isn’t even a core point as Bernstein notes clearly.
FFS, nobody is saying that the general idea of being skeptical is unreasonable. And nobody is being ridiculed for being skeptical. This subthread is about the contents of tptacek’s comment, which doesn't do what you are saying. Claiming that he called DJB’s claims inconceivable is the mischaracterization. People are very eager to paint a picture nobody intended so they can say something and be right.
I use djb’s crypto. Everybody knows his speculation. Everybody knows why he’s pursuing more information. Nobody disagrees more information would be a public good. Some people are more skeptical than others that he’ll find anything substantial.
> If you RTFA you'd know it pertains to bribery, not coercion
By quoting the article, it seems the text directly contradicts your summary as being too narrow. General coercion is also included as part of the concerns raised by TFA. He isn’t just talking about the NSA giving a person a sack of money.
Meanwhile in this thread and on Twitter, many people are indeed doing the things you say that nobody is doing.
We almost all use Bernstein’s crypto — some as mere users, others as developers, etc. I’m not sure what that brings to the discussion.
I’m glad we agree that his work to gather more information is a public good.
The article discusses it generally but uses bribery as the example. Perhaps that’s the confusion. Someone said the idea that we’re gonna find bribes is silly. Someone else said that’s insane, how could you not imagine the govt doing something coercive. Reply was that’s not what I said. Another challenge follows asserting that the gov’t is generally shady and coercive. I tried to clarify what I see as the confusion (bribery vs coercion as an example used in the article). Sorry if my statement was overly broad, my intention was to say we’re probably mostly on the same side and arguing over semantics. Maybe not all of the world is (e.g. Twitter), but it seemed like the case here. Maybe not and tptacek believes the gov’t is infallible. IDK. I like DJB and appreciate what he’s doing.
Maybe he does know what risible means and is in fact extremely well informed, much better informed than you are, to the point where offering sarcasm on the apparent basis of absolutely nothing but what you've learnt from the internet is actually not a valuable contribution to the conversation but instead embarrassing. Have you considered this possibility as well?
I think it's naive and trusting only on the surface, but with some clear intent and goal underneath. In the past he has held a different stance, but it suddenly changed some time after Matasano.
Can I ask that, if you're going to accuse me of shilling in an HN thread, you at least come up with something that I'm shilling? I don't care what it is; you can say that I'm shilling for Infowars Life ProstaGuard Prostate Health Supplement with Saw Palmetto and Anti-Oxidant, for all I care, just identify something.
It's very disconcerting, for the sake of open and honest discourse, that you or someone else decided to flag (and thus censor) my reply to this request.
I believe that NIST is obligated to be responsive to FOIA requests, even if the motivation behind those requests is risible.