
> Ada is used for situations where the code has to work or people die.

C/C++/Java/JavaScript/Python/etc. are used in situations where code doesn't have to be reliable, secure, or correct.

If customers actually cared about any of those things then our favorite tech companies wouldn't be worth zillions of dollars.



I am a big proponent of having proper liabilities across the industry, just like any other industry.

As a software engineering professor of mine used to joke, general computing quality is akin to buying a pair of shoes that randomly blow up if tied incorrectly, and people have been educated to put up with it.

In any other industry, if you buy something that doesn't work properly, the first reaction is usually to return to the shop and ask for your money back.

Thankfully, digital stores, warranty contracts in project delivery, and ongoing cybersecurity laws are already steps in the right direction, yet there is still much to be done.


    EULA: By using this product, you (user) agree that:
    1. It might kill your dog, and it will be your fault.
    2. Any other harm or defects will be your fault as well.
    3. You abandon all rights to sue us, ever, for anything.
    4. Any dispute will be handled by our arbitration department.
    5. We are free to spy on you and share all information with 
       our marketing partners or anyone else who asks.


C is used in situations where code doesn't have to be reliable - wanna expand?

In my superficial understanding of computers, Linux is the paragon of reliability, so naively I would say that shows C is good enough. However, I'm also aware that some institutions don't have such a great view of Linux, but I don't know much.


Linux: written in C, hundreds of buffer overflow errors

Multics: written in PL/I, approximately zero buffer overflow errors

see also: https://www.cvedetails.com/product/47/Linux-Linux-Kernel.htm...


That doesn't tell me much. Linux is much bigger and more widely used than whatever Multics is, so the fact that more bugs are documented is no surprise.


It should tell you that Linux is not a "paragon of reliability" unless you have a non-standard definition of paragon and/or reliability.


And that many of Linux's severe vulnerabilities were facilitated by C/gcc:

"The net result is that a PL/I programmer would have to work very hard to program a buffer overflow error, while a C programmer has to work very hard to avoid programming a buffer overflow error."

https://www.acsac.org/2002/papers/classic-multics.pdf
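To make that quote concrete, here's a minimal sketch of the classic unchecked-copy pattern in C. This is my own illustration, not code from the paper; `greet` and the 8-byte buffer are made up for the example:

    #include <stdio.h>
    #include <string.h>

    /* Classic C buffer overflow: strcpy does no bounds checking, so any
       name longer than 7 characters (plus the terminating NUL) writes
       past the end of buf -- undefined behavior, often stack corruption. */
    static void greet(const char *name) {
        char buf[8];
        strcpy(buf, name);                    /* overflows if strlen(name) >= 8 */
        printf("hello, %s\n", buf);
    }

    int main(void) {
        greet("ok");                          /* fits: 2 chars + NUL */
        greet("definitely more than eight");  /* compiles fine, silently overflows */
        return 0;
    }

    /* The fix the language never forces on you:
       snprintf(buf, sizeof buf, "%s", name); truncates instead of overflowing. */

Which is the paper's point: nothing in C or the compiler stops this, whereas a PL/I string declaration carries its maximum length, so the equivalent copy gets checked for you.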



