
Curious how the numbers work out for number of miles [1] versus number of deaths, autonomous vs. standard vehicles.

I know the scale is different, but since this is the first death I'm curious whether the rates fall in line.

[1] is distance the right metric?




This is an interesting metric. Although self-driving test cars are rare, the whole point of their existence is to drive, so they probably clock more hours than a normal car.

About 5,400 pedestrians are killed each year in the US. US drivers go 3.1 trillion miles a year, so they kill a pedestrian about every 570 million miles. Last November, Waymo said they had 4 million self-driven miles, well short of statistically expecting to hit a pedestrian. In September of an unspecified year, Axios claimed Uber had self-driven over 1 million miles.

My estimate is rough: the mix of pedestrian-heavy and pedestrian-free roads in my total-miles-per-year figure likely does not match what the self-driving testers cover. Also, this may have been a cyclist death, which adds another 800 or so deaths per year.

But, in any event, in rough numbers, Uber appears to have hit its first pedestrian fatality two or three orders of magnitude early, by mileage, relative to the human-driver baseline.
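
A minimal sketch of the arithmetic above, in Python; the mileage and fatality figures are the rough estimates from this thread, not authoritative data:

    # Back-of-envelope: miles per pedestrian fatality for human drivers,
    # using the rough figures cited above (not authoritative data).
    PED_DEATHS_PER_YEAR = 5400      # approximate US pedestrian deaths per year
    MILES_PER_YEAR = 3.1e12         # ~3.1 trillion vehicle miles driven per year

    miles_per_ped_death = MILES_PER_YEAR / PED_DEATHS_PER_YEAR
    print(f"{miles_per_ped_death:.2e} miles per pedestrian death")  # ~5.7e8

    # Uber's claimed autonomous mileage (order of magnitude only)
    uber_miles = 1e6
    print(f"fatality came ~{miles_per_ped_death / uber_miles:.0f}x early")  # ~574x

That ~574x factor is where "two or three orders of magnitude" comes from.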


It occurs to me that because the deaths caused by autonomous vehicles may not follow the same distribution across types of deaths, it might make more sense to compare total deaths per million miles between human and autonomous drivers.


It appears that the current figure is sitting around 10 vehicle deaths per billion vehicle miles travelled.

Which seems unbelievably low. I'm getting these figures from this Wikipedia graph[1]

[0] https://en.wikipedia.org/wiki/Transportation_safety_in_the_U... [1] https://en.wikipedia.org/wiki/File:US_traffic_deaths_per_VMT...
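
A rough cross-check, assuming on the order of 37,000 US traffic deaths a year against the 3.1 trillion annual miles cited upthread (my assumed figures, not taken from the linked graph):

    deaths_per_year = 37_000    # assumed recent US traffic deaths per year
    miles_per_year = 3.1e12     # annual vehicle miles, from upthread
    print(deaths_per_year / (miles_per_year / 1e9))  # ~11.9 per billion miles

So a figure around 10 per billion is in the right ballpark.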


Wow.

If deaths are broken out by pedestrian vs passenger, I wonder how people will respond if safety skews heavily towards one of those groups with self driving cars.


Distance is the metric used so far, but it isn't appropriate for comparing self-driving cars to human-driven cars. The human-driven bucket contains all miles driven: highway cruise control, light snow, heavy rain, tricky merges, etc. The self-driving metric covers only the easiest possible miles. Over time those miles will expand and harder scenarios will be incorporated, but to really know whether self-driving cars are safer we need apples-to-apples comparisons. That will require matching humans vs. robots not just on total miles driven but on a categorization of those miles, maybe a count of tricky unexpected scenarios as well.
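
For instance, a minimal sketch of such a stratified comparison; the road categories and all counts below are invented placeholders:

    # Compare fatality rates within matched road categories rather than
    # over all miles. Every number here is a made-up placeholder.
    human = {"highway": (2.0e12, 20_000), "urban": (1.1e12, 17_000)}  # (miles, deaths)
    robot = {"highway": (3.0e6, 0), "urban": (1.0e6, 1)}

    for category, (h_miles, h_deaths) in human.items():
        r_miles, r_deaths = robot[category]
        print(category,
              f"human {h_deaths / h_miles:.2e}/mi",
              f"robot {r_deaths / r_miles:.2e}/mi")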


Perhaps time as opposed to distance? After all, we drive far fewer miles in a city, but there are far greater opportunities for accidents there, particularly pedestrian accidents.


Maybe deaths normalized by speed and time (or just by distance, since speed multiplied by time is distance). More deaths at higher speeds and fewer at lower speeds would seem likely, so purely using time or distance may skew the interpretation if deaths cluster in some speed ranges more than others.
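
Something like binning exposure by speed band and computing a rate per band; the bands and counts here are invented for illustration:

    # Death rate per speed band, so high- and low-speed exposure are not
    # lumped together. All figures below are invented.
    bands = {
        "0-30 mph":  {"miles": 1.0e12, "deaths": 15_000},
        "30-60 mph": {"miles": 1.5e12, "deaths": 12_000},
        "60+ mph":   {"miles": 0.6e12, "deaths": 10_000},
    }
    for band, d in bands.items():
        rate = d["deaths"] / d["miles"] * 1e9  # deaths per billion miles
        print(band, f"{rate:.1f} per billion miles")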


Yes, these statistics are typically based on miles driven.





