Unqualified people on the internet shouldn't give legal advice, but since we're doing it anyway: no, this is definitely not true. If you make assurances about the fitness of the good, you are on the hook when it fails, even in some absurdly improbable "I had no way of knowing that our pencils intended for schoolchildren would be used on a spacecraft" situation.
There is a reason a ton of warranties and terms of service/EULAs specifically forbid or disclaim use in life-critical situations: in that case you are safe, because you said don't do it. But if you don't, you are generally going to be liable.
Sadly, there is a reason chainsaws say "do not stop chain with genitals". Not only did someone probably do that, but the damages stuck.
For example, I was talking about the CUDA license yesterday, and of course one of the clauses is:
> You acknowledge that the SDK as delivered is not tested or certified by NVIDIA for use in connection with the design, construction, maintenance, and/or operation of any system where the use or failure of such system could result in a situation that threatens the safety of human life or results in catastrophic damages (each, a “Critical Application”). Examples of Critical Applications include use in avionics, navigation, autonomous vehicle applications, ai solutions for automotive products, military, medical, life support or other life critical applications. NVIDIA shall not be liable to you or any third party, in whole or in part, for any claims or damages arising from such uses. You are solely responsible for ensuring that any product or service developed with the SDK as a whole includes sufficient features to comply with all applicable legal and regulatory standards and requirements.
Why is this here? Because they'd be liable otherwise, and more generally they want to be on the record as saying "hey idiot, don't use this in a life-critical system".
There might well be a clause like that in CrowdStrike's license too, of course. But the problem is it's generally different when what you are providing is a mission-critical safety/security system... it's hard to duck responsibility for being in critical places when you are actively trying to position yourself in critical places.
>Sadly there is a reason the chainsaws say "do not stop chain with genitals". Not only did someone probably do that, but the damages stuck.
This is very dependent on your jurisdiction. The USA's laws leave a lot more room for litigation that I would deem frivolous than Canada's do. If you sell a chainsaw with safety features that adhere to common standards, you should reasonably expect people not to try to stop it with their ballsack, and a court of law that holds the manufacturer liable for moronic use of the object is a poorly designed court.
I think you have it flipped. Every clause in a contract is a negotiation, and warranties and insurance coverage are part of it.
Any smart CIO would have said: I'll take what you sell, but if you fail I can come back and haunt you. You're going to name me on your product insurance, and I'll require upping your coverage, plus notification that you remain current on an insurance policy with 50XXX M in coverage, minimum.
If the software is sitting on top of your business's core IT, you must protect the business in its entirety by demanding a proportional shield, and using the IT vendor's own insurance shield as if it were your own, and demanding more coverage if the shield is too small. Once those elements are in place, you are protected. It's as simple as that.