Here we go. We'll now have the first traffic fatality trial where it's not drivers-trying-drivers but people-trying-a-megacorp.
(Disclaimer: I'm a bike advocate, so I may have a different perspective on some of this than most.)
Our car-based transportation system is far and away the most dangerous thing any of us accept doing on a daily basis. 40,000 die a year.
But when cases come to court, everyone on the jury has in the back of their mind "that could have been me if I lost concentration at the wrong moment, or made one bad judgement, etc etc."
So penalties are comparatively light for traffic fatalities. Big punishments are only meted out if the case is so egregious -- repeated drug use, flagrantly reckless behavior -- that the jury can be convinced that the driver is different from them.
In other words, drivers don't get punished for doing something dangerous, because everybody on the road is doing something dangerous. They get punished for doing something more dangerous than the norm.
In this case, there's no question that the "driver" is different than the jury -- it's a computer. Now the symmetry that made jurors compare themselves to the accused is broken.
The result, and what self-driving car advocates don't get, is that self-driving cars don't just have to be safer than human drivers to be free of liability; they need to be safe, period. In a trial, they don't benefit from the default "could have been me" defense.
That's a HUGE requirement. In fact, it's probably impossible with our current road system. It won't just take better self-driving cars, but better roads and a major cultural change in our attitudes about driving.
As a bike advocate, I welcome this shift, but I also see how deluded many of the current self-driving projects are. Software moves fast, but asphalt and mentalities move slow. We're not years away from a self-driving transportation system, we're decades.
And this trial is just the beginning of that long story.
Most drivers avoid the most serious penalties, and more importantly most victims are denied restitution, not because of sympathetic juries but because of insolvency.
The chance someone will cause a grave accident and the chance someone is insolvent are not independent variables.
You get some real horror stories in law school about people trapped in burning vehicles with no remedy for the surviving family.
One way to solve this is strict products liability. Normally we ask whether there was a design defect or negligence. That produces costly litigation in which a lay panel with no special training reviews a pile of technical engineering documents, with highly varying outcomes. Instead, just have the manufacturer pay statutory compensation whenever its product causes harm. The social cost of the product then gets priced in and borne by its purchasers.
It's hard to do that with car accidents, though, because driver error contributes so much to outcomes that it often seems unfair to punish the company.
But we could vastly simplify the auto insurance system and improve the safety of these vehicles by forcing manufacturers to simply warranty against harm once drivers are out as a factor. (Insofar as a warranty is a promise not to do harm, with a fixed penalty for breach set up front.)
It would be fitting if such a statutory regime were among the earliest primary laws about AI, because that would officially make the requirement that AI does not harm humans into literally the first law of robotics.
Moreover, "Wrongful death settlements are often paid out by insurance providers who provide liability coverage for the person or entity for whom the death is being blamed. Insurance policies typically have a policy limit amount, above which the insurance company will not pay and the person is individually liable" from https://www.google.com/url?sa=t&source=web&rct=j&url=https:/...
Although, as far as I know, if the jury were to become aware of this, a mistrial could be declared.
I agree. People wish not to avoid death, but to lessen their fear of death.
Arguments about how relatively safe self-driving cars will be are beside the point. A self-driving car is something that can kill you. The fact that death by self-driving vehicle is especially rare makes it especially scary.
A low accident rate is not a good thing, psychologically. Only when a risk is comfortably common can we comfortably begin to ignore it.
There's a current trend for people to anthropomorphize cars, with the backing of the industry (two eyes/lights and a mouth/grille on the front of "him/her"). Won't the industry go into overdrive on the anthropomorphism front to try and preserve this judicial empathy?