Striking a pedestrian in a crosswalk is horrible news. The reality though is: how many pedestrians are hit by human drivers vs autonomous drivers? On a percentage basis I gotta believe autonomous cars are orders of magnitude safer. Self-driving cars aren't going to be perfect. How many people lost their lives to machines in early factories during the Industrial Revolution? Imagine if they had pulled the plug back then.
> On a percentage basis I gotta believe autonomous cars are orders of magnitude safer.
Why?
NHTSA reports 1.15 fatalities per 100 million vehicle miles travelled in 2015.
Waymo advertises >5 million road miles travelled. Let's say Waymo + Uber have driven 10 million vehicle miles and killed 1 person. That makes them roughly 10 times more dangerous than human-driven cars.
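For anyone who wants to check the arithmetic, here's a minimal sketch. The 10 million combined Waymo + Uber miles is the rough assumption from above, not a reported figure:

```python
# Back-of-the-envelope comparison of fatality rates per mile.
HUMAN_FATALITIES_PER_100M_MILES = 1.15   # NHTSA, 2015
AV_MILES = 10_000_000                    # assumed Waymo + Uber total
AV_FATALITIES = 1                        # this incident

av_rate = AV_FATALITIES / AV_MILES * 100_000_000   # per 100M miles
print(f"AV rate:    {av_rate:.2f} fatalities per 100M miles")   # ~10.00
print(f"Human rate: {HUMAN_FATALITIES_PER_100M_MILES:.2f} per 100M miles")
print(f"Ratio:      {av_rate / HUMAN_FATALITIES_PER_100M_MILES:.1f}x")  # ~8.7x
```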
This is the important statistic. Evidence so far points to autonomous drivers being worse than human drivers, in which case they definitely have no place on the streets without a lot more R&D.
what about tesla though? by november 2016 they'd accumulated 1.3 billion self-driven miles [1]. autopilot was released in september 2014, so that's roughly two years, or about 600 million miles per year. and it was ramping up during those years, so i think it's safe to assume at least 1 billion miles / year these days.
2 deaths in over 2 billion miles is roughly 10 times safer than human drivers.
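quick sanity check on the arithmetic above. the 2-billion-mile total and the ramp-up extrapolation are my own assumptions, not official figures:

```python
# sanity check; the 2-deaths and total-miles figures are assumptions
miles_by_nov_2016 = 1_300_000_000          # cumulative, per [1]
months_since_launch = 26                   # sept 2014 -> nov 2016
avg_per_year = miles_by_nov_2016 / months_since_launch * 12
print(f"average: {avg_per_year / 1e6:.0f}M miles/year")   # ~600M

total_miles = 2_000_000_000                # assumed cumulative today
deaths = 2
tesla_rate = deaths / total_miles * 100_000_000   # per 100M miles
human_rate = 1.15                                 # NHTSA 2015
print(f"tesla: {tesla_rate:.2f} vs human: {human_rate:.2f} per 100M miles")
print(f"~{human_rate / tesla_rate:.1f}x safer, if you accept the premise")
```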
I don't think that is a fair comparison/calculation. Autopilot is not "autonomous" (or at least not as autonomous as the Uber experimental car involved in this crash). The demographics of Tesla drivers are most probably quite different from those of "generic" car drivers, and the national driven-miles figure includes every kind of car (older cars in a bad maintenance state, or simply with inferior braking and steering systems compared to brand-new cars) and every kind of driver (including drivers at a higher risk of accident).
Absolutely true, but keep in mind as well that each of these companies is running a completely different tech stack, for different use cases. They can't be lumped together for comparison.
Autopilot is not self-driving. The autopilot mode disengages if your hands are not on the wheel. Furthermore, users are advised to only use it in relatively safe situations.
>The reality though is: how many pedestrians are hit by human drivers vs autonomous drivers? On a percentage basis I gotta believe autonomous cars are orders of magnitude safer.
That might be a fair question, except that one death at this early stage of limited realistic trials makes me doubt the correctness of your belief. On the statistics that we have right now, I'm not seeing how autonomous cars are safer.
>Self driving cars aren't going to be perfect.
Shouldn't self-driving cars be held to a better standard rather than compared generously to objectively terrible existing standards? That we think of pedestrians regularly killed in "accidents" as acceptable collateral is already pretty horrific.
Our society has evolved quite a bit from the times of the Industrial Revolution. It’s possible for progress to be made while having life-saving regulations.
It appears the woman was not using the crosswalk, but really that doesn't make much difference to the reaction to the report. Uber's self-driving tech seems a bit under-baked at the moment.
I always take the "outside the crosswalk" reports with a grain of salt. Hereabouts, if the pedestrian lands outside the crosswalk, they're assumed to have been crossing outside of it, even if later eyewitness reports and video footage show they were in the crosswalk and the car knocked them out of it.
There is also the problem that in some areas (e.g. California) every intersection is a crosswalk even if unmarked. It's likely that most pedestrians cross in a crosswalk whether they know it or not.
Arizona has something similar:
> By legal definition, there are three or more crosswalks at every intersection whether marked or unmarked.
Source: ADOT Traffic Engineering Guidelines and Processes, Section 910.1
I don't know how many hours their system has been driving, but I highly doubt it's enough to draw comparisons to the average Arizona driver, who hits 0 pedestrians. I also don't think it works like that in the court of public opinion anyway.
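For a sense of scale: at the human rate of 1.15 fatalities per 100 million miles, a few million test miles should produce essentially zero deaths, so a single death is an extremely noisy signal either way. A rough sketch, where the test mileage is an assumed figure for illustration only:

```python
import math

# Expected fatalities from an average human driver over the same mileage,
# modeled as a Poisson process. The test mileage is an assumed figure.
HUMAN_RATE = 1.15 / 100_000_000   # fatalities per mile (NHTSA 2015)
test_miles = 3_000_000            # assumed test mileage, for illustration

expected = HUMAN_RATE * test_miles
p_zero = math.exp(-expected)      # Poisson P(0 deaths) at the human rate
print(f"expected fatalities at human rate: {expected:.3f}")   # ~0.035
print(f"P(zero deaths | human-level safety): {p_zero:.1%}")   # ~96.6%
# Even a perfectly human-level system would almost certainly show zero
# deaths at this mileage, so one death tells us very little statistically.
```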
This system can never make a mistake of this magnitude and be publicly accepted.
Does it really need to be said that an autonomous vehicle still needs to manage to not kill people regardless of their position relative to any nearby crosswalks?
Today you can cross in the middle of the street and take a calculated risk that the people approaching down the street will stop or slow down enough not to kill you. If they kill you anyway, they'll still be prosecuted, because their attention should be on the road, and the most basic assumption behind a driver's license is that its holder is competent and fit enough to drive what amounts to a deadly weapon without killing the people around them. When a driver violates that assumption, the fault is probably on them. Obviously there are exceptions, such as when somebody intentionally jumps in front of your vehicle, but crossing outside the crosswalk to get across the street is not even close to the same level as attempting suicide, and fault will be found accordingly.
> Does it really need to be said that an autonomous vehicle still needs to manage to not kill people regardless of their position relative to any nearby crosswalks?
If you require a 100% guarantee of killing nobody, regardless of what the vehicle and the person are doing, it is achievable only by making those vehicles nearly useless - such as lowering their max speed to something like 10 mph (maybe even lower, since it's still possible to nudge a person who then slips, hits their head on the pavement and dies). If the vehicle is moving fast and somebody jumps onto the street, there are physical limitations on what can be done. So, if this technology is to exist, there will always be a space where accidents can - and eventually will - happen.
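The physical limitation is easy to put numbers on: stopping distance grows with the square of speed. A minimal sketch, where the friction coefficient and reaction delay are typical assumed values, not measurements of any specific vehicle:

```python
# Stopping distance = reaction distance + braking distance (v^2 / 2*mu*g).
MU = 0.7          # assumed tire-road friction coefficient, dry asphalt
G = 9.81          # gravitational acceleration, m/s^2
REACTION_S = 0.5  # assumed sensing/actuation delay, seconds

def stopping_distance_m(speed_mph: float) -> float:
    v = speed_mph * 0.44704            # mph -> m/s
    reaction = v * REACTION_S          # distance covered before braking
    braking = v ** 2 / (2 * MU * G)    # ideal full-braking distance
    return reaction + braking

for mph in (10, 25, 40):
    print(f"{mph} mph -> {stopping_distance_m(mph):.1f} m to stop")
# 10 mph -> ~3.7 m, 40 mph -> ~32 m: past a certain speed, no controller,
# human or machine, can stop for someone who steps out a car-length away.
```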
If that's something the software can't handle well then it's going to be a huge problem in many parts of the world. In many places in Europe, Africa and Asia for instance pedestrians will cross anywhere and everywhere at any time.
>What do you do with an algorithm to provide justice?
Assuming Arizona hasn't produced law specifically for fatalities in their public-road self-driving test program... The algorithms are just one detail of the total system design, and the design of the system isn't really the issue. People don't get killed by designs or code.
The verification of its safety and the decision to deploy that system on uncontrolled public streets will be the issue. People made those decisions, not an algorithm.
Fine/sue the entity that made the algorithm, just like every other situation like that. Or, if you can prove negligence on the part of someone inside that company, prosecute them.