> But I'm at a loss as to how you can claim that getting rid of all the use-after-free bugs (just to name one class) inside a multi-million-line browser engine in a language with manual memory management is easy
I didn't claim it was "easy", just that bugs would still exist and the browser's "securedness" would only increase marginally. The problem isn't that this wouldn't be an achievement; it's one of managing expectations (e.g. saying "Rust is secure", which doesn't make sense).
> Nobody has ever succeeded at it, despite over a decade of sustained engineering effort on multiple browser engines.
It is a nice goal, but the question is where is the world afterwards? Will Firefox all of a sudden become substantially more secure and robust vs. its competition, to the extent that it can outcompete them and increase its market share significantly?
My brief experience at Coverity makes me guess that you could be getting rid of one class of bugs without necessarily improving the product in any noticeable way... that, yeah, those bugs were common but not particularly easy to exploit or hard to fix once found.
Not that the effort isn't worthwhile. I'm just a bit cynical when it comes to seeing the tangible benefits.
> I didn't claim it was "easy", just that bugs would still exist and the browser's "securedness" would only increase marginally.
I disagree with the latter. Based on our analysis, Rust provides a defense against the majority of critical security bugs in Gecko.
> It is a nice goal, but the question is where is the world afterwards? Will Firefox all of a sudden become substantially more secure and robust vs. its competition, to the extent that it can outcompete them and increase its market share significantly?
You've changed the question from "will this increase security" to "is improved security going to result in users choosing Firefox en masse". The latter question is a business question, not a technical question, and not one relevant to Rust or this thread. At the limit, it's asking "why should engineering resources be spent improving the product".
Rust is a tool to defend against memory safety vulnerabilities. It's also a tool to make systems programming more accessible to programmers who aren't long-time C++ experts and to make concurrent and parallel programming in large-scale systems more robust. The combination of those things makes it a significant advance over what we had to work with before, in my mind.
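To make that concrete, here is a minimal sketch of my own (not from the thread) showing both guarantees: a use-after-move/use-after-free pattern is rejected at compile time instead of shipping as a latent bug, and shared mutable state across threads has to go through types the compiler can verify as thread-safe (Arc<Mutex<_>> here).

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Ownership: once `buf` has been moved, any later use is a compile-time
    // error instead of a latent use-after-free.
    let buf = vec![1u8, 2, 3];
    let consumed = buf; // ownership of the heap allocation moves here
    // println!("{:?}", buf); // error[E0382]: borrow of moved value: `buf`
    println!("{:?}", consumed);

    // Concurrency: shared mutable state must be wrapped in something the
    // compiler recognizes as thread-safe; handing a plain `&mut Vec<_>` to
    // several threads would be rejected at compile time.
    let shared = Arc::new(Mutex::new(Vec::new()));
    let handles: Vec<_> = (0..4)
        .map(|i| {
            let shared = Arc::clone(&shared);
            thread::spawn(move || shared.lock().unwrap().push(i))
        })
        .collect();
    for handle in handles {
        handle.join().unwrap();
    }
    println!("{:?}", shared.lock().unwrap());
}
```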
> My brief experience at Coverity makes me guess that you could be getting rid of one class of bugs without necessarily improving the product in any noticeable way... that, yeah, those bugs were common but not particularly easy to exploit or hard to fix once found.
It is true that exploitation of UAF (for example) is not within the skill level of most programmers and that individual UAFs are easy to fix. But "hard for most programmers to exploit and easy to fix" doesn't seem to be much of a mitigation. For example, the Rails YAML vulnerability was also hard to exploit (requiring knowledge of Ruby serialization internals and vulnerable standard library constructors) and easy to fix (just disable YAML), but it was rightly considered a fire-drill operation across Web sites the world over. The "smart cow" phenomenon ensures that vulnerabilities that start out difficult to exploit become easy to exploit when packaged up into scripts, if the incentives are there to do so. Exploitable use-after-free vulnerabilities in network-facing apps are like the Rails YAML vulnerabilities: "game-over" RCEs (possibly when combined with sandbox escapes).
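For concreteness, here is a sketch of my own (not from the thread) of the kind of pattern at issue: mutating a container while holding a reference into it. The C++ equivalent (push_back while keeping a pointer or iterator into the vector) compiles, may reallocate the buffer, and ships as a latent use-after-free; the Rust borrow checker simply refuses to compile it.

```rust
fn main() {
    let mut v = vec![1, 2, 3];
    let first = &v[0]; // immutable borrow pointing into v's heap buffer

    // Uncommenting the next line is rejected with error[E0502]:
    // cannot borrow `v` as mutable because it is also borrowed as immutable.
    // v.push(4);

    println!("{}", first);
}
```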
>the browser's "securedness" would only increase marginally
The developers' claim is that more than half of all security bugs are due to memory safety issues and that Rust will eliminate those. More than halving the number of security bugs doesn't sound marginal to me.