Hacker News
C++ creator rebuts White House warning (infoworld.com)
55 points by ben_s on March 18, 2024 | hide | past | favorite | 62 comments


> I find it surprising that the writers of those government documents seem oblivious of the strengths of contemporary C++ and the efforts to provide strong safety guarantees

No, Bjarne needs to realize that RAII and smart pointers are an old concept now, and they have shown to be insufficient, for decades now.

The bar for safety has been raised, and modern C++ is so far behind that its proponents don't even understand the issue: they still talk as if it's about C's sins, or programmers not using the language properly, or "but what about other bugs?".

Contemporary C++ is still a safety-third design, where safety always loses to performance and backwards compatibility. std::span was standardized without .at(), and they're only now getting around to adding it. Bounds checking by default on `operator[]` is one place where C++ could be on par with Rust right now, but instead it's going to be relegated to an optional profile.


Anecdotally, I feel like I wrote safer C code because at least I fully understood the behavior of the few standard library functions I used - both the good and the bad. C++ always had that question mark of whether you were fully understanding all of the invariants the library placed on you to uphold.

That being said, I've fallen in love with Rust and have no intention of ever going back to C or C++.


I have a big gripe with rust and other modern languages like zig, and it’s difficult to reconcile it.

The gripe is that after using them for a while, anything else like Java, Python, C++, or C is underwhelming, boring, and tedious: an exercise in how long you can uphold standards until you fall back to language defaults.


What these languages show is something anyone used to compiled BASIC, or Pascal dialects like Object Pascal and Modula-2, already knew back in the day.

We had languages that were a pleasure to use for high level programming abstractions, while at the same time, provided the necessary features to go all the way down, even inline Assembly if it must be.

Without the culture that tooling must be hard; rather, developers are users as well.

Thankfully a new generation of developers is bringing this culture back.


> Without the culture that tooling must be hard; rather, developers are users as well.

I don't follow this. Could you elaborate?

I think the current culture is that tooling must be smart, accessible, and very well written, e.g. Rust (cargo, rustc, clippy, rust-analyzer), or Go coming with everything, same for Gleam. Even Zig is a better build system for C programs than most build systems for C.


I am talking about the C tooling culture, especially when comparing how compilers, linkers, and makefiles are exposed, versus the Xerox PARC programming model that was largely adopted by other ecosystems.

Think TP, Delphi, VB, .NET, Smalltalk, Common Lisp, Clipper, Eiffel, MacOS AppToolbox, Java,...


The problem is that he is tilting at windmills, because the community at large doesn't care about all the features and tooling he keeps mentioning. Also, some of that tooling is still quite buggy, e.g. the VC++ and clang lifetime checkers.

The same applies to Herb Sutter's efforts, as seen in his recent blog post.

It doesn't matter how much tooling has been available in the C and C++ ecosystem; a large majority will keep programming as they always did unless forced by external factors to change their ways.

Lint was created in 1979, and to this day many developers refuse to adopt static analysis on their C and C++ projects, let alone more modern tooling.

Technology cannot change community culture.


not all applications need safety

it is wrong to force safety onto the programmer especially if there is a cost to be paid to obtain it


That's an easy statement to make but in practice it's no longer acceptable; the Internet has become far too hostile and code that's "never going to be connected to the Internet" constantly does.


not all devices are connected to the internet

and writing software in c++ does not imply the software is unsafe


Stroustrup, as always, fails to recognize the vast surface area of C++ features, the foot cannons, and the heavy weight of C compatibility around C++'s neck.

C++ barely made sense in 1995. It makes absolutely no sense today.


Versus C, it definitely made lots of sense.

For me it was the only language I could see myself using after getting used to Object Pascal's quality of life in tooling, type safety, and language features.

C felt jurassic already in 1993, and there wasn't any other portable alternative I could reach for.

However, that copy-paste compatibility with C is what has definitely made it unsuitable for writing safe code nowadays, especially with the C culture that is now left.

Those of us that like C++, but are also on the safety side, have already embraced other programming languages, relegating C++ to the absolute must-use cases.


I would argue that it made sense for performance-sensitive code until the maturation of Rust, so it made sense even in 2015. But not today.


you pay a performance penalty

and i don’t want to learn another language and package management system


are there any game engines for AAA development in rust?


makes sense to me


About a year ago, Stroustrup wrote a similar response[0] to a similar statement from the NSA[1].

I wasn't really convinced and am more in agreement with this response to Stroustrup[2] from an embedded Linux developer. To quote from near the end of their response to Stroustrup:

> In the meantime, it is unfair for Dr. Stroustrup to call safe programming languages novelties or to pretend that C++ isn’t already far behind the times on this. This was already an important criticism of C++ decades ago, when Java first came out in the 90’s and was referred to as a “managed programming language.”

I'm not really buying that C++ is really all that commonly safe when we have stats like these[3]:

> A recent study found that 60-70% of vulnerabilities in iOS and macOS are memory safety vulnerabilities. Microsoft estimates that 70% of all vulnerabilities in their products over the last decade have been memory safety issues. Google estimated that 90% of Android vulnerabilities are memory safety issues. An analysis of 0-days that were discovered being exploited in the wild found that more than 80% of the exploited vulnerabilities were memory safety issues.

> The Slammer worm from 2003 was a buffer overflow (out-of-bounds write). So was WannaCry (out-of-bounds write). The Trident exploit against iPhones used three different memory safety vulnerabilities (two use-after-frees and an out-of-bounds read). HeartBleed was a memory safety problem (out-of-bounds read). Stagefright on Android too (out-of-bounds writes). The Ghost vulnerability in glibc? You betcha (out-of-bounds write).

[0]: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2023/p27...

[1]: https://media.defense.gov/2022/Nov/10/2003112742/-1/-1/0/CSI...

[2]: https://www.thecodedmessage.com/posts/stroustrup-response/

[3]: https://www.memorysafety.org/docs/memory-safety/


> I'm not really buying that C++ is really all that commonly safe when we have stats like these[3]:

It's not. At this point, I cannot in good faith say that he is not lying, either to us or himself.


I don't think he's lying to us. I think he genuinely believes that everyone is using it wrong.

Although that's not much of a defense when he was the one who invented the wrong ways to use it.


It's hard to get a man to understand something when his salary (or hobby or self-image) depends on his not understanding it.


It’s probably not a coincidence that he’s the author of the famous quote ("there are only two kinds of languages: the ones people complain about and the ones nobody uses") that can be, and often is, used to deflect any and all criticism of a programming language.


C and C++ are HARD to use correctly, but how many of those 60-70% of vulnerabilities would have been caught by just compiling with LLVM's AddressSanitizer? Wouldn't it have stopped virtually all of them?

https://llvm.org/pubs/2006-05-24-SAFECode-BoundsCheck.pdf

https://clang.llvm.org/docs/AddressSanitizer.html

In many cases we already have the tools. The problem is that people are not using them.

That said, it is still in general a good thing to steer people away from C/C++, due to the languages being very hard to use correctly.


> In many cases we already have the tools. The problem is that people are not using them.

The problem with these tools is that the instrumentation code inserted by the compiler comes with a 50-100% program-wide performance loss (*), and that's not acceptable to C++ developers. So in practice, you don't just add -fsanitize=address to your builds; you add it to test builds and fuzz them. But now you're not just trusting your compiler, you're trusting your tests and their coverage.

The promise of Rust is that many of the memory safety bugs are forbidden at compile time in safe code, and the stuff that has to be checked at runtime (self-referential data structures, out-of-bounds indexing, etc.) can be handled more granularly, with unsafe opt-outs where appropriate, which means you're not going to pay 50-100% in raw performance.

* Take this, like all perf numbers, with a heap of salt; do your own benchmarks and come to your own conclusions.


To developers that cargo cult performance, that is.

I have always enabled bounds checking, and never ever, did it matter for the kind of projects I was involved with.

Not everyone is really writing a VR engine for a console rendering at 120 FPS, but just like everyone wants to be Google, so do many of those developers.


> It would have stopped virtually all of them?

Do you have a source for that claim?


The problem is that there are more compilers out there than clang, and clang isn't available everywhere.


asan only catches what your test suite covers. specifically, that doesn't include novel attacks.


That's kinda true, but if you use the compiler-inserted address sanitizer code, it will turn bugs from exploits into crashes. You can't exploit an OOB write if the write fails and the program crashes.


if you can afford the cost of that, just write your program in any other language; c with asan is a lot slower than the safe alternatives


I made another comment about exactly that, but was addressing the concern that asan doesn't make code safe.


They are hard to use when you absolutely insist on using them the hard way. std::span can provide you lots of safety, but people don't use the available tools.


It provides none, unless you are using C++26 compiled with the hardened standard library mode.

If you care about safety use gsl::span instead.


Significantly though - almost all of those are in C code, even if C++ is used elsewhere in the program.

C++ does no good if you don't #pragma GCC poison all the C-isms.


On a recent CppCon panel[0], after much discussion of ecosystem-wide safety and how "safe" programming languages don't automatically create "safe" systems (which has merit), he basically had to shrug his shoulders when someone asked about thread safety.

[0]: https://www.youtube.com/watch?v=R10pXWHpPn4


> Of the billions of lines of C++, few completely follow modern guidelines

AKA, we’ve created a messy monster that’s out of our control, please don’t blame us.


But this seems like blaming the inventor of knives when people use them dangerously? Knives are really useful for cutting things.

The same goes for cars. Thousands of people have died in car accidents yet nobody is proposing that we replace cars because if you use them correctly they really are quite useful.

The same goes for C++. I see a lot of C++-bashing on this thread (it seems very popular on HN?) but it is a useful language. I have worked with plenty of people who used it wrong but to throw the entire thing into the bin is blaming the wrong thing. People can be dangerous drivers even if the car is really good.


Sure, but it’s more like someone invented and sold a car then strapped a bunch of knives to it.


Maybe they need to fork a version of C++ that doesn't have all the dangerous features and forces developers to write only safe C++. It could be called C++--.



Carbon is still a research project; its authors are the first to say you should use Rust if you don't need to migrate existing C++ code.


The overarching point is to avoid new C++ code whenever possible. Carbon isn't intended to compete with or replace Rust. It's for more easily and seamlessly migrating away from C++ and its worst foot cannons.


See Circle.


I understand wanting to defend your baby, and I think C++ was VASTLY better than most other languages for performance sensitive code until maybe 5 years ago - but the combination of memory safety and thread safety that Rust offers means that there are very few situations where C++ is the appropriate choice for a new project now.

I have written professionally in C++ for 20 years now, and I would pick Rust for a _new_ project/fresh codebase in a heartbeat. tokio alone is so vastly superior to anything you can do in async C++ that it makes zero sense to select C++ for a _new_ project (obviously if you have an existing codebase the price of interop may not be worth it).

Bjarne is being deliberately obtuse here.


>Improving safety has been an aim of C++ from day one and throughout its evolution.

Wut, pretty sure the first aim was compatibility with C, the second aim was flexibility and overengineering, the third aim was performance. Safety was never on the list.


I agree; you can write perfectly safe and uncompromisingly performant code in C++ much better than in any other language today.

It's genuinely difficult to understand why Rust gets rammed down everyone's throats lately; do they really think developers coming out of universities today are too stupid to grasp memory management? If that's the case, why not have everyone code in Scratch?

That memo reads like "to avoid wet pants, everyone should now pee sitting down"; some of us can aim...


We've been trying the "just don't write bugs" strategy for over 40 years now, and it's not working. Framing this as a problem with people being too stupid is a completely unproductive mix of hubris and elitism.

In the disciplines where real safety is required (like engineering, aviation, medicine), it's accepted that people will make mistakes. When a system can fail catastrophically due to a simple human error, it's the system that is broken, and needs to be made more robust.


I didn't say people are too stupid. On the contrary, we onboard fresh grads onto our large C++ code base every once in a while and they all seem to grasp the concepts just fine.

I'm wondering if most folks throwing shade at C++ have only used pre-C++11 toolsets and just have bad memories of the experience.

FWIW, I find modern C++ genuinely great to read / easy to parse by humans (same for similar languages like C#, Java, JavaScript/Typescript) whereas reading Rust (in particular function definitions) is painful.


IME this is mostly a familiarity problem not an actual syntax problem.


No, I mean humans in general, not software developers. Seriously, show a non-developer a printout of some average modern C++ code and some average rust code and see which one they think is easier to read and more visually pleasing.


Average Rust code uses a fair number of lambdas, you won't convince me that C++ has a readable / ergonomic syntax for lambdas.


See log4j vulnerability, you still need "just don't write bugs" with safe languages. What was really tried was "use C the hard way", which fails regularly as one can expect. Projects that use C the easy way have much better safety.


See Shellshock. Memory safety doesn't prevent other bugs. C and C++ projects still have logic errors, broken auth, XSS, SQL injection, and do dangerously dumb stuff, and that's on top of buffer overflows, use-after-frees, data races, and UB footguns.

Nobody promises that memory safety will fix all bugs, but it can prevent or significantly reduce a class of vulnerabilities, and reduce the total number of serious defects. And then time and effort saved on dealing with memory corruption bugs can be redirected towards dealing with all the other higher-level issues.

> Projects that use C the easy way have much better safety.

That's just another way of blaming programmers for not writing C without the bugs.

Every language can be perfectly safe if used correctly — even hand-written machine code. The problem is that it's easy to say "use C the easy way" (whatever that means), but actual real-world use doesn't live up to such a standard, and even the best programmers make mistakes. Language safety is about making programs safer even when programmers write less-than-ideal code.


You're talking about using C the hard way, that's difficult to get right indeed, because it's the hard way.


Safe, modern C++ is poorly understood compared to the decades of experience and the knowledge corpus around the old practices. The unsafe ways are still easily accessible and widely used, taught, and understood. C++ is also laden with complexity, idiomatic styles, immense foot cannons, etc.

Rust is fairly explicit about what's unsafe (i.e., marked unsafe), and safety is the default. It's also generally a simpler language overall, with less accretion of backwards compatibility. There aren't decades of bad habits still usable in Rust. It's got modern sugar, first-class functional constructs, and it's generally cooler, in the way metal was cooler than big band was in 1988.


Scratch is simpler than Rust and a lot safer. That's absurd, but maybe that's where we're headed?


It’s not very metal tho


Right, tell me you've never made an off by one error.

Or had a dangling reference in your program.


> It's genuinely difficult to understand why Rust gets rammed down everyone's throats lately; do they really think developers coming out of universities today are too stupid to grasp memory management?

Yes. Experimentally, that is absolutely correct. New programmers are bad at memory management. So are old ones. So are experts, unless you seriously want to argue that the OpenBSD gang are a bunch of amateurs.

The best C/C++ programmers in the world still write code with memory flaws today. At some point you have to say, you know, maybe it's not the programmers that are the problem.


name one piece of open source software in C or C++ with more than a couple thousand commits that has not had to fix a memory safety bug.


There are millions of pieces of software in C or C++ with more than a couple thousand commits, whereas with Rust there are maybe a handful (the Rust compiler itself, Servo, Deno, and... ?), and I bet they all had to fix memory safety bugs (except I guess we call them "soundness issues" in Rust).

This argument reads like the old "Linux / macOS is safer than Windows, look at all the malware that targets Windows" while ignoring that at the time Windows had 95%+ market share for desktop.


agree not sure why this is downvoted


> It's genuinely difficult to understand why Rust gets rammed down everyone's throats lately

Most software companies have a revolving door of developers and need cookie-cutter tools to limit the damage a "bad" developer can cause. In the 1990s, Java OOP was hyped as the solution to procedural spaghetti code (because it is impossible to architect good software without objects), in the 2010s it was web frameworks (because it is impossible to build a web app without a framework), and in the 2020s it is Rust (because it is impossible to write C/C++/Assembly without memory bugs). The Rust hype cycle is yet another Big Corp push for more guardrails.



