> But it does not change that the autonomous machine does not understand the complex world it is driving in in the slightest
It only needs to understand when to ask for help:
> the driverless car had correctly yielded to the oncoming fire truck in the opposing lane and contacted the company’s remote assistance workers, who are able to operate vehicles in trouble from afar. According to Cruise, which collects camera and sensor data from its testing vehicles, the fire truck was able to move forward approximately 25 seconds after it first encountered the autonomous vehicle
I don’t understand why people think that driverless cars need to deal with every one in a million scenario; it makes no sense.
The phrase "only needs" is doing a lot of work here. It's one of my least favorite phrases (in a close race with "just do this") and is often associated with unrealistic feature requests.
I'm sure there is a disagreement between SFFD and Cruise as to exactly what happened, but the article implies that the Cruise vehicle isn't the one that moved to fix the problem.
> The fire truck only passed the blockage when the garbage truck driver ran from their work to move their vehicle.
Even if the Cruise vehicle was able to call for help, it not only needs to make that call but also to wait for a response (at 4 a.m.) and give a remote human enough information to operate the car safely from afar. None of these things are easy... not impossible, but not "only needs" easy.
> driverless cars need to deal with every one in a million scenario
Of course driverless cars need to deal with one in a million scenarios. Human drivers deal with one in a million scenarios every day: nothing is ever quite the same when driving, so there are always subtle variations. Even in an unusual situation, there must be some kind of response. Pulling to the side of the road and putting on the hazard lights to signal that it doesn't know what to do (which it may have done) would have been a better response than doing nothing and sitting there waiting for a human. There should be a default "unknown input" failure mode. The disagreement here is that SFFD didn't like how the Cruise vehicle failed. Maybe there is a better approach.
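To make the idea concrete, a default "unknown input" failure mode is just an explicit fallback branch in the decision loop. This is a minimal sketch; every name and scenario label here is hypothetical and has nothing to do with Cruise's actual software:

```python
from enum import Enum, auto

class Action(Enum):
    PROCEED = auto()
    YIELD = auto()
    PULL_OVER_HAZARDS = auto()  # default "unknown input" failure mode

def decide(scenario: str) -> Action:
    """Map a classified scenario to an action. Anything unrecognized
    falls through to a safe default instead of freezing in the lane."""
    known = {
        "clear_road": Action.PROCEED,
        "oncoming_emergency_vehicle": Action.YIELD,
    }
    # Unknown input -> pull over, put on hazards, escalate to a remote
    # operator, rather than blocking traffic while waiting for help.
    return known.get(scenario, Action.PULL_OVER_HAZARDS)

print(decide("oncoming_emergency_vehicle"))           # Action.YIELD
print(decide("garbage_truck_blocking_both_lanes"))    # Action.PULL_OVER_HAZARDS
```

The point isn't the table of known cases; it's that the `.get()` default guarantees every input, however rare, maps to *some* deliberate behavior.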
We are expecting these vehicles to move us around 24/7. That's a lot of trips. At that volume, one in a million scenarios will happen every day. That's the thing about large numbers -- even rare events are to be expected when N is high enough.
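The arithmetic bears this out: with a per-trip probability p of hitting a rare scenario and N trips per day, the chance of at least one occurrence is 1 - (1 - p)^N. A quick sketch, where the fleet-wide trip count is made up purely for illustration:

```python
# Probability that at least one "one in a million" event occurs in a day,
# assuming independent trips. The trip count is a hypothetical figure,
# not a real fleet statistic.
p = 1e-6                 # per-trip probability of a rare scenario
trips_per_day = 500_000  # hypothetical fleet-wide daily trips

p_at_least_one = 1 - (1 - p) ** trips_per_day
expected_events = p * trips_per_day

print(f"P(at least one rare event today) = {p_at_least_one:.3f}")  # ~0.393
print(f"Expected rare events per day     = {expected_events:.1f}")  # 0.5
```

Even at half a million trips a day, a literal one-in-a-million event has roughly a 40% chance of showing up daily; scale N up and it becomes a near certainty.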
> I don’t understand why people think that driverless cars need to deal with every one in a million scenario; it makes no sense.
That a particular situation is rare does not mean you won't encounter multiple rare situations in a given time frame.
(By the way, didn't the article state at the beginning that the garbage truck had to move? Either way, it required manual intervention and a human's situational awareness. And how does the car ask for the proper help at freeway speeds, where there are only seconds, not even split seconds, to react?)