Every lesson from aviation is earned in blood. This death wasn't necessary, though. The Otto/Uber guys have been informed about their cars' difficulty sensing and stopping for pedestrians; I know because I informed them myself after one almost ran me down in a crosswalk in SF. You can't learn anything from your lessons unless you listen. Maybe they can pause and figure out how to actually act on the reports of unsafe vehicle behavior they receive.
It's interesting that the car was exceeding the speed limit of 35 mph. I would assume the car would stay at or below the speed limit. Who gets the speeding ticket in this case? And do those extra few mph affect the available reaction time and stopping distance enough that the car could have noticed her and taken evasive action?
Whoever owns the algorithm. Or at least whoever's name the license/permission was issued in. If it's an organization, the top management that signed off on this has to take the blame.
Legally, the person behind the wheel was still the driver. They are responsible for both the speeding and for killing a pedestrian. At this stage it's no different than using cruise control - you are still responsible for what happens.
I really hope you're wrong. If the legal system doesn't distinguish between cruise control and SAE level 3 autonomy, the legal system needs to get its shit together.
No, that's bullshit. It's physically impossible for a human to intervene on the timescales involved in motor accidents. Autonomy that requires an ever-vigilant driver to be ready to intervene at any second is literally worse than no autonomy at all; because if the driver isn't actively driving most of the time, their attention is guaranteed to stray.
I agree with you - but that's literally the stage we're at. What we have right now is like "advanced" cruise control - the person behind the wheel is still legally defined as the driver and bears responsibility for what happens. The law "allows" these systems on the road, but there is no framework out there which would shift the responsibility to anyone else but the person behind the wheel.
>> It's physically impossible for a human to intervene on the timescales involved in motor accidents.
That remains true even without any automatic driving tech - you are responsible even for accidents which happen too quickly for anyone to intervene. Obviously if you have some evidence (dashcam) showing that you couldn't avoid the accident you should be found not guilty, but the person going to court will be you - not the maker of your car's cruise control/radar system/whatever.
I currently have two cars: a '14 Mazda3 with AEB, lane departure alert, radar cruise control, BLIS, and rear cross-traffic alert, and an '11 Outback with none of that (but DSC and ABS, as well as AWD).
The assists certainly help more than anything else, so I feel the Mazda is much safer to drive in heavy traffic than the older Outback.
The cruise only has autonomy over controlling the speed and applying the brakes, but it is still autonomy. Of course, since my hands never leave the wheel, it may not fit with what you have in mind.
Having said that, Mazda (or Bosch?) really nailed their radar: it has never failed to pick up motorbike riders, even though the manual warns us not to expect that to work.
I feel more confident in a system where the ambition is smaller, yet execution more solid.
FWIW, I also tested the AEB against cardboard boxes, driving at them at 30 km/h without touching the accelerator at all, and came away very impressed by the system. It intervened so late that I felt sure it wasn't going to work, but it did: the first time there was a very slight impact, and the next two were complete stops with small margins.
This stuff is guaranteed to save lives and prevent costly crashes (I generally refuse to use the word "accident") on a grander scale.
Bullshit?? It may be autonomous, but these cars are still far away from driverless. YOU get in the car, you know the limitations, and you just said you consider yourself physically incapable of responding in time to motor accidents and that the safety will be worse than a non-autonomous car. Sounds to me like what's bullshit is your entitlement to step into an autonomous vehicle when you know it diminishes road safety. Autonomous vehicles can in theory become safer than human drivers; what is bullshit is that you want to drive them now, when they are strictly not yet safer than a human, and to do so without consequences.
I attended an Intelligent Transport Systems (ITS) summit last year in Australia. The theme very much centred on autonomous cars: legality, insurance/liability, and enhancements.
There are several states in the USA that are more progressive than others (namely CA). But with many working groups on the legal side, the current ambiguity will hopefully become a thing of the past.
In Australia, they are mandating that by some year soon (I don't have it on hand), a 5-star safety rating will require some level of automation to exist. Things like lane departure warning or ABS will become as standard as aircon.
Assuming ABS means "Anti-Lock Braking System" in this context, isn't that already standard? I can't think of a (recent) car with an ANCAP rating of 5 that doesn't have ABS. I'm not sure I would even classify ABS as automation in the same way that something like lane departure is automation. ABS has been around (in some form) since the 1950s, and works by just adjusting braking based on the relative turning rates of each wheel. Compared to lane departure, ABS is more like a tire pressure sensor.
Does this responsibility stay with the driver, despite this clearly being an Uber operation? Aside from the victim, did self-driving tech just get its first, uhm, "martyr"?
By law (and please correct me if I'm wrong), the driver of the vehicle is responsible for everything that happens with the vehicle. Why would it matter if the vehicle is owned by UPS, FedEx, Pizza Hut or Uber? Is a truck driver not responsible for an accident just because they drive for a larger corporation?
Let me put it this way - my Mercedes has an emergency stop feature when it detects pedestrians in front of the car. If I'm on cruise control and the car hits someone, could I possibly blame it on Mercedes? Of course not. I'm still the driver behind the wheel and those systems are meant to help - not replace my attention.
What we have now in these semi-autonomous vehicles is nothing more than a glorified cruise control - and I don't think the law treats it any differently (at least not yet).
Now, if Uber(or anyone else) builds cars with no driver at all - sure, we can start talking about shifting the responsibility to the corporation. But for now, the driver is behind the wheel for a reason.
The San Francisco Chronicle late Monday reported that Tempe Police Chief Sylvia Moir said that from viewing videos taken from the vehicle “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." (bit.ly/2IADRUF)
Moir told the Chronicle, “I suspect preliminarily it appears that the Uber would likely not be at fault in this accident,” but she did not rule out that charges could be filed against the operator in the Uber vehicle, the paper reported.
Driving rules in the UK changed, at least a decade ago, so that there is no 10% margin. Speedometers are required by law never to read under your actual speed, and they are more reliable now. So if you're doing 36 mph you'd be fined.
On top of the speedometer, the car has the GPS speed to compare against as well; I can't see how there is any excuse for being over the limit.
The stats quoted in UK advertising were that at 40 mph, 80% of pedestrians hit will die from the crash; at 30 mph, 20% will die.
Had the car been doing just under the limit, e.g. 33 mph, there's a much better chance that the woman would have survived.
I cannot find a reference to back up your claim of the 10% + 2 mph margin having been axed. In fact, I remember the Chief Constable calling for the end of it recently (implying it is still being used):
So when the road sign says 35mph it means the official speed limit is exactly 38.5mph?
Because sometimes that 10% is argued to be a margin of error for humans who supposedly aren't paying attention to how fast they're going, but if that's the case then there's really no reason why a robot shouldn't drive strictly under the speed limit.
If you explicitly programmed a fleet of robots to deliberately break the law, then I don't think it's consequence enough to fine just the first robot that gets caught breaking that law while the programmers adjust the fleet's code so it doesn't get caught again.
Consequences should be more severe if there's a whole fleet of robots programmed to break the law, even if the law catches the first robot right away and the rest of the fleet is paused immediately.
It should be noted that speedometers display a higher number than the actual speed. So if a cop flags a driver at 38.5 mph, there's a good chance their speedometer showed 40+ mph.
It's said she was hit immediately after entering a traffic lane outside a crosswalk. Quite possibly there was no time for the autopilot to react at all. I hope releasing all video footage from self-driving crashes becomes mandatory.
> Chief of Police Sylvia Moir told the San Francisco Chronicle on Monday that video footage taken from cameras equipped to the autonomous Volvo SUV potentially shift the blame to the victim herself, 49-year-old Elaine Herzberg, rather than the vehicle.
> “It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway,” Moir told the paper, adding that the incident occurred roughly 100 yards from a crosswalk. “It is dangerous to cross roadways in the evening hour when well-illuminated managed crosswalks are available,” she said.
Advocates for non-drivers have pointed out that many investigations of car crashes with pedestrians and cyclists tend to reflexively blame the pedestrian or cyclist and generally refuse to search for exculpatory evidence.
Based on the layout of the presumed crash site (namely, the median has a paved section that would effectively make this an unmarked crosswalk), and based on the fact that the damage was all on the passenger's side (which is to say, the pedestrian would have had to cross most of the lane before being struck), I would expect that there is rather a lot that could have been done on the driver's side (whether human or autonomous) to avoid the crash.
Your passenger's side comment didn’t make sense to me until I read the Forbes article linked above:
> Herzberg is said to have abruptly walked from a center median into a lane with traffic
So that explains that. However, contrary to the thrust of your argument, the experience of the sober driver, who was ready to intervene if needed, is hard to dismiss:
> “The driver said it was like a flash, the person walked out in front of them,” Moir said. “His first alert to the collision was the sound of the collision.”
And also:
> “It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway,” Moir told the paper, adding that the incident occurred roughly 100 yards from a crosswalk. “It is dangerous to cross roadways in the evening hour when well-illuminated managed crosswalks are available,” she said.
Yes, I see. So she walked maybe 2 meters into the lane before being hit. At a slow walk (1 meter/second), that's 2 seconds. At 17 meters/second, the car covers 34 meters in that time. And 2 seconds is about twice the nominal disengagement time. So yes, it's iffy.
And at a moderate sprint, which is what most adults do when they try to cross a roadway with vehicular traffic, that's 4-5 m/s, giving the vehicle 0.4-0.5 seconds to react. At 40 mph ~ 18 m/s, that gives the vehicle 7-9 meters in which to stop.
No human could brake that well, and simply jamming the brakes would engage the ABS leading to a longer stopping distance. Not to mention that the human reaction time of 0.5-0.75 seconds would have prevented most people from even lifting their foot off the accelerator pedal before the collision, even if they were perfectly focused on driving.
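To make the back-of-the-envelope numbers concrete, here's a rough calculation (the 0.8 g braking deceleration, 4.5 m/s sprint speed, 2 m of lane, and 0.6 s reaction time are assumptions for illustration, not figures from the report):

    # Rough time/distance budget for the scenario sketched above.
    G = 9.81                              # m/s^2
    v_car = 18.0                          # ~40 mph in m/s
    decel = 0.8 * G                       # assumed full-braking deceleration
    t_exposed = 2.0 / 4.5                 # seconds the pedestrian is in the lane
    d_travelled = v_car * t_exposed       # distance the car covers before impact
    d_to_stop = v_car ** 2 / (2 * decel)  # distance needed to brake from 18 m/s to 0
    d_reaction = v_car * 0.6              # distance lost to a 0.6 s human reaction
    print(round(d_travelled, 1), round(d_to_stop, 1), round(d_reaction, 1))
    # -> about 8 m covered before impact, ~20 m needed to stop, ~11 m lost to reaction

Under those assumptions the car simply doesn't have the distance to stop, regardless of who or what is driving; the only open question is how much speed it could have scrubbed off.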
> simply jamming the brakes would engage the ABS leading to a longer stopping distance
I was taught that the entire point of ABS is so that you can just jam the brake and have the shortest stopping time instead of modulating it yourself to avoid skidding. Do you have any source to the contrary?
ABS is intended to enable steering by increasing static road friction. It is not intended to decrease stopping distance, and in many cases increases stopping distance by keeping the negative G's away from the hard limit in anticipation of lateral G's due to steering.
Older dumb ABS systems would simply "pump the pedal" for the driver, and would increase stopping distance in almost all conditions, especially single-channel systems. Newer systems determine braking performance via the conventional ABS sensors plus accelerometers. These systems will back off N g's, then increase the g's, bisecting between the known-locked and known-unlocked conditions to find the optimum. These systems _will_ stop the car in the minimum distance possible, but very few cars use them.
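A very stripped-down sketch of that kind of slip-seeking logic, purely for illustration (the target slip ratio and gains here are made-up values; real controllers run per wheel, hundreds of times per second, with far more state):

    # Toy slip-seeking brake controller for a single wheel, one update step.
    TARGET_SLIP = 0.15  # roughly where tire grip peaks on dry asphalt (assumed)

    def update_brake_pressure(pressure, vehicle_speed, wheel_speed):
        """Nudge brake pressure toward the slip ratio with the most grip."""
        if vehicle_speed <= 0:
            return 0.0
        slip = (vehicle_speed - wheel_speed) / vehicle_speed
        if slip > TARGET_SLIP:   # wheel is (nearly) locked: back off
            return pressure * 0.8
        return min(pressure * 1.1 + 0.5, 100.0)  # still rolling freely: squeeze harder

    # Wheel at 5 m/s while the car does 18 m/s -> deep slip, so pressure drops.
    print(update_brake_pressure(80.0, 18.0, 5.0))  # -> 64.0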
I was taught the point of ABS was to keep control over steering while stepping on the brakes instead of skidding out of control into god knows what/who
Wikipedia backs me up but adds that it also decreases stopping distance on dry and slippery surfaces, while significantly increasing stopping distances in snow and gravel. I’m from a country with a lot of snow so that makes sense.
That's correct. The ABS basically releases brake pressure as soon as the wheels lock. On most surfaces this will shorten your stopping distance versus a human locking the tires. It is never the optimal stopping distance, though.
In terms of split second reactions, it's pretty much optimal still to just jam the brakes if you have ABS. It's much better than braking too little, which is what most non-ABS drivers would have done.
When you lock the wheels in loose snow or gravel, it piles up in front of the tire and provides a fair amount of friction. This is usually the fastest way to stop, and one of the reasons that gravel pits along corners in motor racing are so effective.
That said, the point of ABS is that in the rare event you have to brake at full power, the system automatically helps you do it near-optimally (slightly skidding) without additional input, and you retain full steering ability.
If you don't have ABS you'd need to train that emergency stop ability on a daily basis to even come close.
> Wikipedia backs me up but adds that it also decreases stopping distance on dry and slippery surfaces,
Many cars, such as my POS Ford Focus, use a single-channel ABS system. These systems will oscillate all four brakes even if only one is locked. Combined with the rear-wheel drum brakes, the ABS considerably increased stopping distances on dry road.
In my experience of walking my bicycle, you go slower than usual when doing so, and it's pretty difficult to abruptly change direction in that situation. I would be curious to know the FOV of the camera that recorded her.
Yes, I also assumed that. Back when I rode a lot, I don't recall sprinting across roadways with my bike. Also, from the photo, she had a heavy bike, with a front basket. And yes, they ought to release the car's video.
er... LIDAR needs ambient light now? Also if you look on Google Street View, the pedestrian entered the road from a median crossing that you can see straight down the middle of from the road hundreds of feet away. I bet they don't release the footage from the car though ;)
Isn't that an overly broad use of the term? I mean, if someone steps in front of a moving vehicle from between parked vehicles, the driver may have only a fraction of a second to react. Whose fault is it then?
Maybe it's society's fault, for building open-access roadways where vehicles exceed a few km/h.
I think you’re right about the street design being the main cause in this case. A street with people on it should be designed so that drivers naturally drive at slow, safe speeds. The intersection in question is designed for high speed. https://www.strongtowns.org/journal/2018/2/2/forgiving-desig...
I don't remember reading about parked vehicles. The accident location seems too narrow to park any vehicles.
As others have said in the comments, the whole point of having the technology is defeated if it performs worse than humans. Assuming vehicles were parked there, a sane human driver would account for the possibility of someone suddenly coming out from between them and would not drive at 40 mph.
>a sane human driver would account for the possibility of someone suddenly coming out from between them and would not drive at 40 mph.
If that's the case, most drivers on the road are very far from "sane drivers." I've been illegally passed on narrow residential streets many times because I was going at a speed that took into account the fact that someone might jump out from between parked cars.
> she came from the shadows right into the roadway
Also, we were told radar would have solved exactly this limitation of humans.
> Uber car was driving at 38 mph in a 35 mph zone
Also, we were told these cars would be inherently safer because they would always respect limits and signage.
> she is said to have abruptly walked from a center median into a lane with traffic
I don't know about other drivers, but when someone is on the median or close to the road, I usually slow down on principle because it doesn't match the usual expectations of a typical 'safe' situation.
I've been advocating against public testing for a long time, because it just treats people's safety as an externality. Uber is cutting corners; not all companies are that sloppy, but this is, overall, unacceptable.
E = (1/2)mv^2, whether the driver is a superhuman robot or a human.
This means there's a minimum distance that even an optimal driver needs to stop a car doing x mph. Yes, an autonomous vehicle has a faster reaction time* to begin the stop, but no matter the reaction time, a stop cannot be instantaneous from any substantial speed.
If it takes 20 feet to stop a car doing 20MPH, it will take 80 feet to stop a car doing 40mph. If there's a human between the initial brake point and 80 feet from it, that human will be hit, no matter who or what the driver is.
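The quadratic scaling falls straight out of d = v^2 / (2a). A quick sanity check using the 20 ft / 20 mph figure above (which implies roughly 0.67 g of deceleration; that value is just back-derived for illustration):

    # Braking distance grows with the square of speed: d = v^2 / (2*a).
    def braking_distance_ft(speed_mph, decel_g=0.67):
        v = speed_mph * 1.467   # mph -> ft/s
        a = decel_g * 32.2      # g -> ft/s^2
        return v ** 2 / (2 * a)

    print(braking_distance_ft(20))  # ~20 ft
    print(braking_distance_ft(40))  # ~80 ft: double the speed, 4x the distance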
The promise of self-driving cars is (was) that they're much better than humans at predicting the behavior of other moving entities. A pedestrian doesn't suddenly materialize on the road in front of the car. She comes from somewhere, and the radar could have detected her (even "in the shadows") and slowed down in anticipation of the uncertainty.
Or maybe it couldn't, but then the whole "narrative" of the experiment is in serious jeopardy.
> whole "narrative" of the experiment is in serious jeopardy.
Not really. Self driving cars are supposed to be better than average human driver. That does not imply that they NEVER make mistakes.
I do not know the specifics of this case, but a general comment: if somebody is hiding behind a bush and (deliberately or by mistake) runs in front of the car, there is no way the car can anticipate that. There is no way to avoid accidents in 100% of cases.
We have some corners where old houses even intrude a bit onto the road. When passing these corners you have to slow down so you can stop in case a child runs out from behind the corner. You can't just blame the victim if you are in control of your own speed.
I can think of many situations where I have avoided hitting pedestrians because of my awareness of the situation. E.g.: a pedestrian with earphones, looking at their phone, crossing against a red light just because the left-turning vehicle in the left lane had stopped for a red arrow while I had a green going straight - the pedestrian mostly hidden behind that car, only visible through its windows.
Or a pedestrian behind high snow banks heading towards an ordinary pedestrian crossing with no lights, almost completely hidden by the snow banks and by a bus parked at a bus stop 50 m from the crossing, on a 50 km/h road. Since I had already seen the pedestrian from far away, I knew someone would show up there by the time I arrived. On the other hand, I would never pass a bus like that at high speed anyway; pedestrians like to just run across in front of the bus. And high snow banks next to a crossing are a big red flag too.
I live in Sweden, though, where pedestrians are supposed to be treated as first-class citizens, since they have no armor.
When you are driving you should be prepared to stop. If you're turning into a street you cannot see into and you're going faster than you can stop from, you're not prepared to stop - you're just hoping that no one is there. This should be, and in Denmark is, fully expected and enforced. This is not the same as driving along a street and having someone jump out in front of you.
I have now actually seen the video of the crash and I can agree that it most likely was hard for a human to avoid. What surprises me is that the LiDAR completely missed her: she didn't run, she didn't jump, she was slowly walking across the road. I can't say if the light was really too bad; camera footage often looks much darker than what you'd see with the naked eye when you're not blinded by other lights. The driver was looking down at the instrument panel at the time of the crash; do they have some view there of what the car sees?
This looks like exactly the situation the self-driving cars are supposed to be able to avoid: a big object in the middle of the street. I would expect the car to try to avoid this even though the bike didn't seem to have any reflectors. If the LiDAR doesn't catch this, I don't think they should be out in traffic at all.
> We have some corners where old houses even intrude a bit onto the road. When passing these corners you have to slow down so you can stop in case a child runs out from behind the corner. You can't just blame the victim if you are in control of your own speed.
Yes, but this is a 4-lane roadway. I can totally imagine driving cautiously and slowing down near residential areas where houses are close to the road. However, this seems like a different case.
It, or the driver, could do more than just stop though. You can change directions, even at 38mph.
Then we have to get into other questions, would I as a driver willingly sideswipe a car next to me to avoid hitting a pedestrian? Is it reasonable to expect an AI to make the same value decision?
It's not unknown for people to crash and burn to avoid hitting squirrels. And with modern airbag systems, it's arguably safer for all concerned for cars to risk hitting poles, and even trees. But on the other hand, once leaving the roadway there's the risk of hitting other pedestrians.
This is a major ethical decision to make. What if the airbags don't deploy? What if there are other unforeseen consequences to crashing one's car to save somebody else's life? I honestly believe that, given a split-second reaction time, any decision made by a human should be considered right.
An algorithm, however, is a different deal: what should happen has already been decided in the algorithm, so in some way it's already settled who gets killed.
When driving on surface streets, I do my best to track whatever is happening in front of me, not just on the roadway. Given all the sensors on a self-driving car, why can't it detect all moving objects, including those off the roadway but approaching it?
It's all about what's hidden behind something opaque. Blind spots are one thing, but if you jump right in front of a car out of nowhere, from a place totally invisible to the sensors, it's a totally different case.
Yes, of course. But that isn't what happened here, right? A woman and bicycle on a median should have been quite obvious. I don't even see substantial landscaping on the median.[0]
We can try. In theory, each self-driving vehicle doesn't have to drive in isolation; they can be networked together and take advantage of each other's sensors and other sensors permanently installed as part of the road infrastructure.
That would increase the chance a particular vehicle could avoid an accident which it couldn't, on its own, anticipate.
Also the fact that most people now carry tracking devices. And that more and more, there are cameras everywhere. So there's potential for self-driving vehicles to know where everyone is.
It would be much safer if all roadways with speed limits over a few km/h were fenced, with tunnels or bridges for pedestrian crossing. Arguably, we would have that now, but for legislative efforts by vehicle manufacturers many decades ago. Maybe we'll get there with The Boring Company.
"Most people" (which is, in reality, "most of my geek friends with high disposable income") shifts to "everyone" by the end of sentence. Also, my devices seem to know where I am...within a few city blocks: I do not like your requirement of always-on mandatory tracking, both from privacy and battery life POVs.
Even worse, this has major false negatives: it's not a probation tracker device - if I leave it at home, am I now fair game for AVs? And even if I have it with me and a fine position is requested, rarely do I get beyond "close to the corner of X and Y Streets"; usually the precision tops out at tens of feet: worse than useless for real-time traffic detection.
Moreover, your proposal for car-only roadways is only reasonable for high-speed, separated ways; I sure hope you are not proposing fencing off streets (as would be the case here: 50 km/h > "a few km/h", this is the usual city speed limit).
OK, it was a dumb idea. Mostly dark humor. Reflecting my concerns about smartphones etc tracking location. But I do see that GPS accuracy for smartphones is about 5 meters,[0] which is close to being useful to alert vehicles to be cautious. And yes, it wouldn't protect everyone, and would probably cause too many false positives. And it's a privacy nightmare.
Some do argue that speed limits for unfenced roadways within cities ought to be 30 km/h or less. And although fatalities are uncommon at 30 km/h, severe injuries aren't. I live in a community where the speed limit is half that. But there's no easy solution, given how prevalent motor vehicles have become. Except perhaps self-driving ones.
Tempe police have said that the car didn't brake ("significantly") prior to the collision[1]. So it does not seem that the car reacted optimally but simply had too much speed to halt in time.
I don't have any insight into whether or not the car attempted to brake, but it's necessary to respond to the "superhuman reaction speed" remark: even if the reaction speed is superhuman, that doesn't necessarily mean it's enough.
Accidents can still occur even if the car is perfect.
Actually, you can't go from 40 mph to 0 in an instant, at least not with a passenger car. Between reacting and braking, stopping typically takes a few seconds. If someone throws themselves in front of a car, the car needs that time to respond. Based on the speed, distance and other parameters, I don't think any car would have been able to stop in an instant.
Thinking about it seriously, maybe it shouldn't even try: slamming the brakes like that could also lead to a pile-up with other vehicles coming in from behind.
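For a rough sense of the distances involved at the reported 38 mph (the deceleration and reaction times below are assumptions for illustration, not measured values):

    # Total stopping distance = distance covered while reacting + braking distance.
    def stopping_distance_m(speed_mps, reaction_s, decel=7.5):
        return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel)

    v = 17.0  # ~38 mph in m/s
    print(round(stopping_distance_m(v, 0.2), 1))  # ~22.7 m even with a 0.2 s machine reaction
    print(round(stopping_distance_m(v, 1.0), 1))  # ~36.3 m with a ~1 s human reaction

So even a car that reacts almost instantly still needs a couple of dozen meters to actually come to rest from that speed.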
This comment section should be read in front of Congress the next time regulating the tech industry is on the table. These people literally think it's OK to perform experiments that kill people.