Waymo and Cruise routinely have driverless cars on city streets. In California, all collisions involving autonomous vehicles, however minor, must be reported, and the DMV posts the reports on its web site.[1] Most are very minor. Here's a more serious one from last month:
"A Cruise autonomous vehicle ("Cruise AV") operating in driverless autonomous mode, was traveling eastbound on Geary Boulevard toward the intersection with Spruce Street. As it approached the intersection, the Cruise AV entered the left hand turn lane, turned the left
turn signal on, and initiated a left turn on a green light onto Spruce Street. At the same time, a Toyota Prius traveling westbound in the
rightmost bus and turn lane of Geary Boulevard approached the intersection in the right turn lane. The Toyota Prius was traveling
approximately 40 mph in a 25 mph speed zone. The Cruise AV came to a stop before fully completing its turn onto Spruce Street due to the
oncoming Toyota Prius, and the Toyota Prius entered the intersection traveling straight from the turn lane instead of turning. Shortly
thereafter, the Toyota Prius made contact with the rear passenger side of the Cruise AV. The impact caused damage to the right rear door,
panel, and wheel of the Cruise AV. Police and Emergency Medical Services were called to the scene, and a police report was filed. The
Cruise AV was towed from the scene. Occupants of both vehicles received medical treatment for allegedly minor injuries."
Now, this shows the strengths and weaknesses of the system. The Cruise vehicle was making a left turn from Geary onto Spruce. Eastbound Geary at this point has a dedicated left turn lane cut out of a grass median, two through lanes, a right turn bus/taxi lane, and a bus stop lane.[2] It detected oncoming traffic that shouldn't have been coming straight out of that turn lane and was going too fast. So it stopped, and was hit.
It did not take evasive action, which might have worked. Or it might have made the situation worse. By not doing so, it did the legally correct thing. The other driver will be blamed for this. But it may not have done the thing most likely to avoid an accident. This is the real version of the trolley problem.
The way I would put it is: after a few tens to hundreds of thousands of such events, we'll be able to estimate the parameters of the human preference functions associated with the non-fatal trolley problem (the classic trolley problem is irrelevant to the car ML folks; they are not aiming to maximize global utility), and that will help guide our confidence in rolling out more.
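To make that a bit more concrete, here's a minimal sketch of the kind of estimation I mean: fit a simple model over logged incident features to recover how people actually weigh the outcomes. The features, labels, and logistic form are all invented for illustration; this is not anything Cruise or Waymo actually does.

    import numpy as np

    # Hypothetical logged incidents: closing speed (m/s), gap to the other
    # vehicle (m), and whether pedestrians were nearby. The label is 1 if
    # reviewers/claims data judged the vehicle's hard stop acceptable.
    X = np.array([
        [18.0,  4.0, 1.0],
        [ 6.0, 12.0, 0.0],
        [11.0,  7.0, 0.0],
        [15.0,  3.0, 1.0],
        [ 4.0, 20.0, 0.0],
    ])
    y = np.array([0.0, 1.0, 1.0, 0.0, 1.0])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Fit a logistic "acceptability" model with plain gradient descent.
    w, b, lr = np.zeros(X.shape[1]), 0.0, 0.01
    for _ in range(5000):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)

    print("fitted weights:", w, "bias:", b)
    # With tens of thousands of real events, the same idea yields confidence
    # intervals on how people trade off braking hard vs. swerving.

The point isn't this particular model; it's that the parameters become estimable once the event count gets large enough.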
For all the woe and gloom in the news reporting, Google's (and Cruise's) rollouts have been more or less what I expected: no enormous accidents clearly caused by a computer, but instead a small number of small accidents, usually due to the human driver of another vehicle doing something wrong. That seems to be leading towards greater acceptance of self-driving cars and confidence that they are roughly as good as an attentive newbie.
The next big situation, I think, will be some really large-scale pileup with massive damages and deaths, and a press cycle where the self-driving car gets blamed. But the self-driving car will have collected a forensic-quality audit log, which of course will help the police determine which human caused the accident.
I've been reading those DMV reports for years, and there are clear patterns which repeat. One is where an autonomous vehicle started to enter an intersection with poor sight lines to the cross street. Sensing cross traffic, it stopped, and was then rear-ended by a human-driven vehicle that was following too closely. Waymo had that happen twice at the same intersection in Mountain View. There's a tree in the median strip there which blocks the view from their roof sensor until the vehicle starts to enter the intersection. So the Waymo system advances cautiously until it has good sensor coverage, then accelerates or stops as required.
Humans tend not to do that, and, as a result, some fraction of the time they get T-boned.
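For what it's worth, the "advance cautiously until the sensors can see, then commit or stop" behavior described above is simple to express. This is just an illustrative sketch with made-up thresholds and a made-up perception interface, not Waymo's actual logic:

    from dataclasses import dataclass

    @dataclass
    class Perception:
        coverage_m: float           # how far down the cross street is actually visible
        cross_traffic_gap_s: float  # time gap to the nearest detected cross vehicle

    REQUIRED_COVERAGE_M = 60.0  # visible distance needed before committing
    MIN_GAP_S = 6.0             # minimum acceptable gap to cross traffic

    def intersection_policy(p: Perception) -> str:
        """Creep until sight lines are adequate, then either commit or stop."""
        if p.coverage_m < REQUIRED_COVERAGE_M:
            return "CREEP"  # a tree or parked truck still blocks the view
        if p.cross_traffic_gap_s >= MIN_GAP_S:
            return "GO"
        return "STOP"

    print(intersection_policy(Perception(20.0, 9.0)))  # CREEP
    print(intersection_policy(Perception(80.0, 2.0)))  # STOP
    print(intersection_policy(Perception(80.0, 9.0)))  # GO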
AI should absolutely mimic the behavior of real (good) drivers.
Although that's not what you're describing here, another problem for AI could result from it knowing more than an average driver; for example, if a high-mounted LIDAR were able to see around corners and let the car decide it's "safe" to do a turn that no human would attempt for lack of visibility, that could cause problems.
(Also, it's surprising that an autonomous car doesn't detect that another car is following it too closely and slow down appropriately in anticipation. How is this not taken into account?)
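The check being asked about is easy to state, at least as a heuristic. Whether and how production stacks actually do this I don't know, so treat this as a hypothetical sketch:

    def plan_decel(own_speed_mps: float, desired_decel_mps2: float,
                   rear_gap_m: float) -> float:
        """Soften a planned deceleration when a follower is dangerously close.

        Hypothetical heuristic: if the time gap to the vehicle behind is under
        about one second, cap braking at a gentler rate (and by implication
        start slowing earlier) instead of stopping abruptly in front of a
        tailgater.
        """
        rear_time_gap_s = rear_gap_m / max(own_speed_mps, 0.1)
        if rear_time_gap_s < 1.0:
            return min(desired_decel_mps2, 2.0)  # m/s^2, gentle braking
        return desired_decel_mps2

    # 15 m/s (~34 mph) with a car 8 m behind: soften a hard 6 m/s^2 stop.
    print(plan_decel(15.0, 6.0, 8.0))   # -> 2.0
    print(plan_decel(15.0, 6.0, 40.0))  # -> 6.0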
An autonomous vehicle has to protect its occupants at the expense of everything else (or, at the very least, appear to do so in a convincing manner), because otherwise no one will step inside.
(At the very least, if a machine is going to sacrifice me or my family to save a third party, I need to know what hierarchy it is following, and how it was decided and by whom.)
But what this incident seems to illustrate is that it's difficult for a self-driving car to share the road with human drivers and behave like a human -- meaning, allowing human drivers to anticipate what it will do.
I drive a motorcycle and a bike in Paris; the reason I'm still alive is that, after so many years of this, I can tell what all the other cars will do at all times, before they know it themselves.
But an autonomous vehicle that behaves so differently from a human driver as to be unpredictable would be terrifying.
Totally agree; beginner and truly dangerous drivers share a quality: unpredictability. Even the most aggressive, fast drivers can easily be accounted for when you're driving defensively. Unpredictable drivers are how collisions happen when speed and weather are not factors.
Even the simple automatic emergency braking features in my Model 3 result in some dangerously unpredictable behaviour at times. I have them set as close to off and as insensitive as possible, and they still do some awful stuff from time to time. I was on a road trip in northern Ontario this week, passing a service vehicle moving very slowly along the shoulder on the right. I was doing 105 km/hr in the right lane of a 3-lane road: 2 lanes in my direction and 1 opposing. The 2nd lane switches directions every 5-10 km to serve as a passing zone. The posted speed limit is 80 km/hr, but prevailing speeds on this road are 90-110 and you'd never be ticketed for anything under 110 in this region. There was a truck gaining on me coming up on the left, so I signalled left and partially moved over to give the service vehicle some room, but didn't fully take the left lane, to let the faster truck know I would let them through as they came past. Very common pattern on this type of road and circumstance. The AEB slammed on the brakes as I came level with the service vehicle, despite the fact that there was plenty of space to complete the pass. I wasn't expecting my car to slow down, let alone apply emergency braking force, so in my surprise I nearly collided with the service vehicle. I have no idea what the other drivers thought, but neither of them could have possibly been predicting or expecting me to slam on the brakes. I was really upset with the car, since it took a highly dangerous action in an otherwise perfectly safe and common situation. And that was the automatic stuff that can't be turned off; I don't let Autopilot drive, ever, because it is the worst type of driver: unpredictable. The choices it made about lane placement, follow distance, defensive driving (and the complete lack thereof), and the general behavioural clues it provided to other drivers in the extremely short time I tested it were genuinely terrifying.
> By not doing so, it did the legally correct thing. The other driver will be blamed for this. But it may not have done the thing most likely to avoid an accident. This is the real version of the trolley problem.
It performed the only sane solution to the problem - stopping. You can't possibly predict what the most likely thing is to avoid an accident, because there is a multitude of factors beyond just 2 cars moving towards each other. Are there pedestrians present? Other parked cars? Storefronts with customers inside? How close to the sidewalk? Trees? Construction/Debris?
Once we get AI working better than 99.9% of humans on roads with speed limits of up to 30 mph, we can expand it to faster roads and introduce advanced behaviors. But for now, stopping in uncertainty is the best available option.
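One way to see why "stop" tends to win under uncertainty is to write the choice as an expected-cost comparison. Every probability and cost below is invented purely to show the structure of the argument, not taken from any real planner:

    # Toy comparison of stopping vs. swerving when the surroundings are only
    # partially known. All numbers are made up for illustration.
    C_COLLISION = 10.0     # relative cost of the two-car impact
    C_THIRD_PARTY = 100.0  # relative cost of hitting a pedestrian/storefront/parked car

    actions = {
        #          P(impact with other car)  P(harming a third party)
        "stop":   {"p_collision": 0.30,      "p_third_party": 0.01},
        "swerve": {"p_collision": 0.10,      "p_third_party": 0.20},
    }

    def expected_cost(a: dict) -> float:
        return a["p_collision"] * C_COLLISION + a["p_third_party"] * C_THIRD_PARTY

    for name, a in actions.items():
        print(name, expected_cost(a))
    # stop 4.0, swerve 21.0: when pedestrians and obstacles are unknown, the
    # third-party term dominates and braking wins even though it "fails" more often.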
> It performed the only sane solution to the problem - stopping. You can't possibly predict what the most likely thing is to avoid an accident, because there is a multitude of factors beyond just 2 cars moving towards each other. Are there pedestrians present? Other parked cars? Storefronts with customers inside? How close to the sidewalk? Trees? Construction/Debris?
Not only are there other factors, but you also need to predict what the other human driver will do. As stated in a sibling comment, this is a huge part of safe driving.
> Once we get AI working better than 99.9% of humans on roads
But what you stated above requires AGI, so what's the plan to get there? It's even more blurry than commercial fusion at that point.
Yes, this applies to autonomous vehicles being tested in the state. Tesla skirts this rule by reporting their system (including the supposedly 'Full Self-Driving' beta) as a Level 2 driver-assistance feature. That puts them out of scope for the reporting requirements for collisions (and, crucially, for disengagements).
"A Cruise autonomous vehicle ("Cruise AV") operating in driverless autonomous mode, was traveling eastbound on Geary Boulevard toward the intersection with Spruce Street. As it approached the intersection, the Cruise AV entered the left hand turn lane, turned the left turn signal on, and initiated a left turn on a green light onto Spruce Street. At the same time, a Toyota Prius traveling westbound in the rightmost bus and turn lane of Geary Boulevard approached the intersection in the right turn lane. The Toyota Prius was traveling approximately 40 mph in a 25 mph speed zone. The Cruise AV came to a stop before fully completing its turn onto Spruce Street due to the oncoming Toyota Prius, and the Toyota Prius entered the intersection traveling straight from the turn lane instead of turning. Shortly thereafter, the Toyota Prius made contact with the rear passenger side of the Cruise AV. The impact caused damage to the right rear door, panel, and wheel of the Cruise AV. Police and Emergency Medical Services were called to the scene, and a police report was filed. The Cruise AV was towed from the scene. Occupants of both vehicles received medical treatment for allegedly minor injuries."
Now, this shows the strengths and weaknesses of the system. The Cruise vehicle was making a left turn from Geary onto Spruce. Eastbound Geary at this point has a dedicated left turn lane cut out of a grass median, two through lanes, a right turn bus/taxi lane, and a bus stop lane. It detected cross traffic that shouldn't have been in that lane and was going too fast. So it stopped, and was hit.
It did not take evasive action, which might have worked. Or it might have made the situation worse. By not doing so, it did the legally correct thing. The other driver will be blamed for this. But it may not have done the thing most likely to avoid an accident. This is the real version of the trolley problem.
[1] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...
[2] https://earth.google.com/web/@37.78169591,-122.45337171
[3] https://patch.com/california/san-francisco/speed-limit-lower...