Hacker News

Actually, there was a metric of accidents per distance driven. Tesla often pulled it out before their first fatal Autopilot accident last year (the AI didn't see the rear of a truck, because of the sun or something like that).

This metric was often decried for its poor statistical significance. It would nevertheless be worth updating it to account for this death.
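To see why a fatalities-per-mile metric has poor statistical significance, consider how wide the uncertainty is when only one fatal event has been observed. The sketch below is illustrative only: the figures (1 fatality over ~130 million Autopilot miles, as Tesla claimed in 2016, and roughly 1.16 fatalities per 100 million vehicle-miles for US drivers overall) are assumptions for the example, and the exact Poisson interval is a standard way to bound such a rate, not anything Tesla published.

```python
import math

def chi2_ppf_even(p, df):
    """Inverse CDF of chi-square with an even number of degrees of
    freedom, using the closed-form Erlang CDF and bisection
    (pure stdlib, no scipy required)."""
    m = df // 2
    def cdf(x):
        s = sum((x / 2) ** i / math.factorial(i) for i in range(m))
        return 1.0 - math.exp(-x / 2) * s
    lo, hi = 0.0, 1000.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def poisson_rate_ci(events, exposure, alpha=0.05):
    """Exact two-sided CI for a Poisson rate (events per unit exposure)."""
    lower = 0.0 if events == 0 else 0.5 * chi2_ppf_even(alpha / 2, 2 * events)
    upper = 0.5 * chi2_ppf_even(1 - alpha / 2, 2 * events + 2)
    return lower / exposure, upper / exposure

# Illustrative numbers (assumptions, not official figures):
lo, hi = poisson_rate_ci(events=1, exposure=130e6)  # 1 death, 130M miles
human_rate = 1.16  # fatalities per 100M vehicle-miles, US average (approx.)
print(lo * 1e8, hi * 1e8)  # roughly 0.02 to 4.3 fatalities per 100M miles
```

With a single event, the 95% interval spans more than two orders of magnitude and comfortably contains the human baseline, so the data can neither show Autopilot is safer nor that it is more dangerous; that is the "poor statistical significance" objection in concrete terms.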

Maybe the metric would now indicate that AI is more dangerous than humans; it will be interesting to see whether the perception of this flawed metric evolves in reaction...



> Actually, there was a metric of accidents per distance driven. Tesla often pulled it out before their first fatal Autopilot accident last year (the AI didn't see the rear of a truck, because of the sun or something like that).

I'm curious to see that, do you have a link? That would actually be a valid comparison.


I'm on my phone, so maybe I haven't found the most relevant links, but I found the following.

This is Elon Musk bragging before the first Autopilot accident: https://www.telegraph.co.uk/technology/2016/04/25/elon-musk-...

This is a counterpoint with some facts: http://safer-america.com/safe-self-driving-cars/

And concerning the crash, the last update seemed to point to the driver being at fault... with Autopilot engaged... I don't know whether the final ruling is out yet:

https://www.washingtonpost.com/news/the-switch/wp/2017/06/20...


According to that "update", Tesla Autopilot can never be at fault, right? Because the driver is supposed to be in charge. So even if the Tesla rams full speed into a semi, the driver is at fault!


Yes, that's how Level 2 systems (like Tesla Autopilot) work: the driver is expected to supervise at all times and remains legally responsible.


Neat, thanks for the data.



