
Uber consistently has some of the highest disengagement rates reported in California; their cars really are just worse than everyone else's.



What does that mean in practice exactly? Does it directly imply that pedestrians are at a higher risk of being hit?


It means the cars are sufficiently dangerous in general that the human drivers have to constantly take over manual control to avoid accidents. So they are dangerous to someone. Apparently the human didn't make it in time this time, and it turns out that someone was 'pedestrians/bicyclists'.


As a pure safety judgement, that'd depend on:

a) what % of disengagements are 'to avoid accidents' vs. navigating entirely safe but complex situations. For ex: a parked UPS truck on a narrow 2-lane road.

b) the degree of real-world complexity of the routes the cars are being tested on (compared to other vendors). For ex: testing only on simple suburban routes = fewer disengagements. (Rough sketch of both adjustments below.)
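
To make that concrete, here's a back-of-the-envelope sketch in Python. Every field name here is made up for illustration; the DMV reports don't actually record whether an event was safety-critical or how hard the route was, so a real comparison would need that classification first.

    from dataclasses import dataclass

    @dataclass
    class Disengagement:
        safety_critical: bool    # would contact likely have occurred without the takeover?
        route_complexity: float  # e.g. 1.0 = simple suburban loop, 3.0 = dense urban core

    def safety_relevant_rate(events, miles_driven):
        # Factor (a): count only disengagements that actually avoided an accident,
        # per 1,000 miles, discounting cautious handoffs in complex-but-safe spots.
        critical = sum(1 for e in events if e.safety_critical)
        return 1000.0 * critical / miles_driven

    def complexity_adjusted_rate(events, miles_driven):
        # Factor (b): weight by route complexity so a vendor testing only easy
        # suburban routes isn't flattered by the raw per-mile number.
        avg_complexity = sum(e.route_complexity for e in events) / len(events) if events else 1.0
        return safety_relevant_rate(events, miles_driven) / avg_complexity

The point is just that the raw disengagements-per-mile figure conflates (a) and (b); without something like the safety_critical classification you can't separate cautious handoffs from genuine near-misses.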

Not trying to defend Uber here; I'm just trying to understand the distinction between a high-level general statistic and its real-world implications.


A complex situation == a dangerous one for a prototype system. If the situation is complex, the car doesn't know what to do and can't be relied on, which makes the situation dangerous even if it were 'complex but safe' for some hypothetical human driver. The uncertainty is in the map, not the territory. Waymo replays disengagements in simulation to see how many were truly dangerous, and most aren't; Uber, for some reason, declines to say whether they do the same or what the results are. Much like they declined to tell the truth about incidents like running a red light at high speed.
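
Roughly what that simulation triage looks like, as I understand it (pure sketch; every name below is invented, and the real counterfactual pipeline is obviously far more involved and not public):

    def triage(disengagement_logs, simulator):
        # Replay the moments around each disengagement with the takeover removed,
        # letting the self-driving stack keep control instead of the safety driver,
        # and count how many would plausibly have ended in a contact or near-miss.
        truly_dangerous = 0
        for log in disengagement_logs:
            outcome = simulator.replay_without_takeover(log)
            if outcome.had_contact or outcome.min_clearance_m < 0.5:
                truly_dangerous += 1
        return truly_dangerous, len(disengagement_logs)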

There is no reason whatsoever to believe that Uber's horrific disengagement record, orders of magnitude worse than competitors often operating in the same cities like SF, is because they are tackling orders-of-magnitude harder situations per mile or are orders of magnitude more conservative; and there is every reason, even before this fatality (the only one so far, despite many self-driving programs running concurrently for years), to believe they were just plain worse.

> Not trying to defend Uber here; I'm just trying to understand the distinction between a high-level general statistic and its real-world implications.

Again, it has to be dangerous for someone. It can't be more dangerous overall yet not dangerous to anyone in particular; that just doesn't make any sense. Uber runs on the same roads, in the same traffic, with the same basic approach as everyone else, so there's no confounder that could produce a reversal.



