Surely there are laws or a legal framework for what to do when a driverless car is pulled over. If not... why not impound it on the spot? If there was a violation and there's no driver, it's logical to conclude the software will duplicate the infraction, meaning it's legally untenable to allow the vehicle to continue to operate in autonomous service.
That seems like an expensive option that solves very little. Whatever the infraction is, it is just as likely to be repeated by any car in the fleet. Better for the police to have a way of dealing with it and documenting it so it gets addressed. Certainly whatever agreement the company has with the city can address this, including the company possibly paying per-incident money to the police for their time.
Certainly they should, at the very least, be paying the standard fine amount. But a broken headlight, for instance, isn't something you just pay the fine for and then ignore. You have to keep paying the fine if you don't fix it.
You want to see confusion? Imagine coming up to a four-way stop, with four cars and some pedestrians, where one of the cars is a Waymo vehicle and among the pedestrians is a little Starship delivery robot [1].
Pure chaos. Welcome to Mountain View, California where the future is now.
First, when you come to a four-way stop, usually one of two things happens: either everyone knows the order in which they arrived and acts accordingly, or there's a moment of hesitation while you decide as a group who goes first. Usually this is done with eye contact and slowly nudging the car forward a bit, right? But there isn't anyone in a self-driving car to make eye contact with, so everyone just cuts it off, causing it to sit there or move dangerously slowly into the intersection. I've experienced this more than once and it's unnerving. Less because of the automated car and more because of other drivers' reactions to it.
Now, imagine a little delivery robot waiting to cross with the pedestrians, but for some reason it doesn't, and is now stuck by itself, attempting to cross every 30 seconds or so, turning on a little flashing light and nudging out into the road, then stopping and backing up onto the sidewalk.
(Legally, does a robot using a pedestrian crossing require me to stop? If it were human, you would. Do I need to sit there for 5 minutes while it works out how to deal with the intersection? Is a delivery bot a pedestrian?)
Anyway, the combination of the above, which I witnessed once with my own eyes around the corner from my house, was an absolute mess. Mostly because while the delivery bot and the Waymo car were playing stop-and-go with each other, impatient people just drove around them.
In the California driver's license test there's a whole section dedicated to cable cars in San Francisco, as they get special privileges. I think we need to come up with some basic rules and guidelines for sharing the road with autonomous vehicles, and soon. I mean, look at this video! Even the cops are bewildered about what to do.
It's a pretty simple section if I'm remembering correctly. It basically says, "Cable cars always get the right of way," then goes into details in case that wasn't clear enough.
I'm surprised that at least one of the two parties (my money would be on the car) did not have a standard procedure for the encounter. The car could have provided an explanatory message and a contact number, for instance. Overall though, I guess it's another edge case self-driving cars need to handle.
The strangest part for me is when the car drives away from the officer, but then pulls to the side and puts on its flashers. If I had to guess, I'd say maybe it was in a mode where it started to drive away, sensed the police lights, and then pulled over?
In any case, we've seen videos where police will literally fire their weapons into a car that tries to drive away from them. I guess this car got lucky today.
You can’t see it from the video, but I suspect the police car had turned off its lights, which caused the driverless car to proceed as normal, and then it pulled over again when the police car turned them back on.
You’d expect a driverless car to pull over to let emergency vehicles pass, not to actually be stopped by them.
Is it possible the car was given instructions to do that, to get out of the way of traffic? The place where it is initially is awkwardly near a "parklet" (outdoor dining area in a former parking spot) as well as right at an intersection and fire hydrant.
If you watch a few body camera videos, you’ll see police using cell phones a lot to talk to superiors. Better audio quality, doesn’t use radio bandwidth, more private, and probably not recorded for the public record.