Hacker News

Obviously more information is needed, but I thought the entire point of having a driver behind the wheel is to manually intervene to prevent this very situation?

I'm very curious to see how they'll investigate this and who will be determined to be at fault (person behind the wheel or Uber). It will likely set a precedent.



Human beings simply cannot switch between "not focussed" and "in charge of a car, taken out of autonomous mode, and actively avoiding collision" fast enough to avoid most accidents, unfortunately. Neither can humans maintain the focus required to be ready to do that when 99.9% of the time they're not required to do anything.

Semi-autonomous cars have drawbacks.


I agree with what you are saying: until self-driving cars get good enough that human attention really isn't needed, it would be very dangerous to sell cars where drivers may be lulled into the complacency of thinking they don't need to pay attention to the road.

However, these are test vehicles. The driver's full-time job is to be focused on the road and what the car is doing. They shouldn't have any misconceptions about the need for them to stay focused on the road. Now granted, even then attention will lapse occasionally, just like it does when you are actually driving. But I don't think that being physically in control of the wheel is necessary for maintaining focus at the same level as a good driver. Driver's Ed instructors need this skill. Also some back seat drivers I know are quite good at maintaining strong focus on the road regardless of who is driving :)


It's not about misconceptions, humans just aren't capable of it. Maintaining concentration when you're actively taking part in a task is doable, but maintaining concentration when you haven't had to do _anything_ for the past two hours is not. It isn't about whether the driver believes they should be paying attention or not, it's just not how human attention works.


> Human beings simply cannot switch between "not focussed" and "in charge of a car

This is not how many people drive in non-autonomous cars? Commuting is on auto-pilot for many of us; you don't really remember how you got to work and I see people tying their shoelaces, calling, eating sandwiches, reading the paper, chatting on whatsapp all the time. Sure it's all illegal (in a lot of countries), but people are bored and drove that road 1000x.


> This is not how many people drive in non-autonomous cars?

No, that is not how people drive in non-autonomous cars. The phenomenon you're referring to is related to how sparsely information is stored to memory when doing a routine task (even when you were fully concentrating). Just because you can't recall an activity in detail does not mean you were not paying full attention.


Which is why the machine should be backup for the always-driving human, only leaping in to correct failings.


Exactly like what has been done successfully in aviation for decades now. For example Auto-GCAS. I don't understand why car companies are trying to go against a proven model.


For a couple of years now we've had automatic braking, and more recently lane-keep assist, which is pretty much exactly the 'automated backup to human drivers' option. It's great, but people still want more automated driving systems.


Because nobody is trying to remove pilots from the equation. Uber sees this as a testbed for a future without human drivers.


The military is actively working to remove pilots from the equation, at least for certain missions. Eventually some of that technology will be spun off to civil air transport.


The other proven model is autonomous metro trains. They are fully automatic and, should the system fail, can sometimes be controlled manually.

This is preferable, because a child, blind person etc needs to be able to use the autonomous vehicle alone.


That only works for vehicles on dedicated separate paths.


Modern aviation includes fully autonomous modes though. An airplane's autopilot is simpler than a car's autonomy, but they accomplish essentially the same goal: get you from point A to point B with no human interaction. AFAIK, even takeoff and landing can be done mostly autonomously on modern aircraft.

If anything, airplanes prove that machines doing most of the work and humans stepping in only when necessary is a proven model.


That's not at all the same thing. When commercial pilots engage the autopilot (or autoland) they're still actively flying the airplane, just operating at a higher level of abstraction to decrease the workload and fatigue. They're not sitting back and playing Candy Crush on their smartphones.

https://www.usatoday.com/story/travel/columnist/cox/2014/08/...


> If anything, airplanes prove that machines doing most of the work and humans stepping in only when necessary is a proven model.

Airplanes have far less traffic to deal with. And the points at which they deal with traffic (e.g. takeoffs and landings) are completely controlled by humans, including many humans outside of the plane.


This is incorrect. There are at least automatic landing systems that are sometimes used.


So what are the conditions in which auto landing is not used?


Apparently pilots generally prefer not to use them. Not because they don't work, but because pilots still need to be on alert, so it's easier to just land manually.

They're used in low visibility conditions with relatively calm weather, but don't work well in bad weather.


A significant number of air accidents have resulted from unintended interaction between the autopilot and the pilot, usually through some level of confusion about whether the autopilot is engaged or not (and 'how engaged'; aircraft autopilots can have a complicated array of modes). There's a lot of learning about autonomous-system-to-human interface design embodied in modern aircraft autopilots.


When an unusual situation causes the autopilot to disengage in the air, you often have minutes of time to deal with and correct the issue, almost never less than 5-10 seconds. In a car, you're lucky if you have even a single second, and that's just not enough time to take over.
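The takeover-time gap can be made concrete with a back-of-envelope calculation. This is a rough illustration only; the speed and takeover delay are assumed values, not measurements from any incident:

```python
# Rough illustration of why one second of takeover time is not enough:
# the distance a car covers at city speeds before the human even begins
# to brake. Speed and reaction time here are illustrative assumptions.

def distance_during_takeover(speed_kmh, takeover_s):
    """Metres travelled before the human driver starts to respond."""
    return speed_kmh / 3.6 * takeover_s

# At 60 km/h, a 1-second takeover delay alone covers about 16.7 m
# before any braking occurs.
```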


Automation for airplanes is really much simpler, thanks in large part to the hidden human effort that keeps planes well separated, so pretty much all the autopilot has to do is hold course and attitude. If that were all we had to do for cars, it'd be pretty simple too.


Please no. I don't want to be killed by my car's software.



Your car's software already has the capability to kill you. Things like automatic braking are much more likely to save your life than to endanger it. To say otherwise is just paranoid scaremongering.


This breaks the site guideline against calling names in comments. Could you please not do that? Your comment would be much better without the last sentence.

https://news.ycombinator.com/newsguidelines.html


notallcars


Certainly, humans will never be perfect but there is the wicked problem of how we interact with systems that behave correctly 99% of the time but require human attention the other 1%. There was an excellent Econ Talk podcast about this topic a while ago:

http://www.econtalk.org/archives/2015/11/david_mindell_o.htm...

However, in my opinion, the driver and by extension his employer Uber must be held responsible in this case. Uber have a license to test their self-driving cars only if there is a human driver in control at all times. Regardless of the crazy things their software decides to do, the human is there as a person of ultimate legal liability.


How about autonomous on highways, turnpikes, etc., and human-operated on streets where a car might ever need to yield to pedestrians or cyclists outside of red lights?


> I'm very curious to see how they'll investigate this and who will be determined to be at fault (person behind the wheel or Uber). It will likely set a precedent.

It's also plausible that the pedestrian was at fault. Under some circumstances it would be impossible for a human driver to avoid a collision with a pedestrian; this may be no different.


Motor vehicle occupants are much more likely to survive collisions than pedestrians and cyclists are. But blaming victims (especially dead ones) has a long and storied history in how law enforcement handles road deaths.

Hopefully with self driving vehicles we can start to move beyond this attitude, especially when there should be ample camera footage available to help dispel the usual claims that the person struck "came out of nowhere" or whatever.


The only way this is a reasonable failure of both the computer and the human driver is if they both physically had no time to react. Maybe that was the case, maybe it wasn't.


Given these cars have been blowing red lights, the human drivers aren't doing their jobs and overriding the car.


It could have been the fault of the pedestrian. I'm not saying it is, just saying there is a third possibility.


If humans can't prevent themselves from having accidents 100% of the time when they're in total control of the vehicle, why should we expect they could prevent an accident 100% of the time when they're in partial control of the vehicle?


> Obviously more information is needed, but I thought the entire point of having a driver behind the wheel is to manually intervene to prevent this very situation?

I think it's time to point out the obvious, and require that autonomous cars apply the brakes first, and THEN require driver intervention.

And that they be a whole lot quicker to err on the side of braking.

Cameras getting fuzzy?

Slow down.

Your ML algorithms are showing lower confidence measures for how they classify nearby objects and trajectories?

Slow down.

Nearby vehicles slowing down and you don't know why?

Slow down.

This is inexcusable.


Nothing I love more than HackerNews armchair engineering.

It's not that simple; you're assuming the car even had some indication that something was wrong. For all we know, the car's vision was reporting high confidence that it saw an open road.


Car rapidly approaches you from behind.

Slow down?


Consciousness requires a body. This is how it is, here.



