I am a big proponent of having proper liability across the software industry, just like in any other industry.
As a software engineering professor of mine used to joke, general computing quality is akin to buying a pair of shoes that randomly blow up if tied incorrectly, and people have been educated to put up with it.
In any other industry, if you buy something that doesn't work properly, the first reaction is usually to return to the shop and ask for your money back.
Thankfully, digital stores, warranty contracts in project delivery, and ongoing cybersecurity laws are already steps in the right direction, yet there is still much to be done.
EULA: By using this product, you (user) agree that:
1. It might kill your dog, and it will be your fault.
2. Any other harm or defects will be your fault as well.
3. You abandon all rights to sue us, ever, for anything.
4. Any dispute will be handled by our arbitration department.
5. We are free to spy on you and share all information with our marketing partners or anyone else who asks.
C is used in situations where code doesn't have to be reliable - wanna expand?
In my superficial understanding of computers, Linux is the paragon of reliability, so naively I would say that shows C is good enough. However, I'm also aware that some institutions don't have such a great view of Linux, but I don't know much.
And that many of Linux's severe vulnerabilities were facilitated by C/gcc:
"The net result is that a PL/I programmer would have to work very hard to program a buffer overflow error, while a C programmer has to work very hard to avoid programming a buffer overflow error."
C/C++/Java/JavaScript/Python/etc. are used in situations where code doesn't have to be reliable, secure, or correct.
If customers actually cared about any of those things then our favorite tech companies wouldn't be worth zillions of dollars.