
You think that if it were failing to drive safely in any significant way, with this wide a deployment, there wouldn't be a lot of crashes? Having an AI drive is the best way to make the driver zone out. At this point, it usually fails by going into intersections a little too slowly.



There are entire YouTube channels dedicated to videos of Tesla Self Driving doing stupid things, like trying to turn into a tram. At scale, 99.99% correct will still kill many thousands of people. Compared to the sheer volume of cars on the road, there aren't actually that many Teslas out there.
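
A rough sketch of that scale arithmetic (every number below is an assumed placeholder for illustration, not measured data):

    # Why "99.99% correct" still produces huge failure counts at fleet scale.
    # All inputs are illustrative assumptions, not sourced figures.
    decisions_per_mile = 10        # assumed: driving decisions the system makes per mile
    success_rate = 0.9999          # "99.99% correct" per decision
    fleet_miles_per_year = 1e9     # assumed: annual miles across the deployed fleet

    failures = fleet_miles_per_year * decisions_per_mile * (1 - success_rate)
    print(f"expected bad decisions per year: {failures:,.0f}")  # ~1,000,000

Most bad decisions are caught by the driver or cause no harm, so fatalities would be some small fraction of that, but the point stands: a per-decision rate that sounds near-perfect still multiplies out to enormous absolute numbers.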


I enjoy videos on the self-driving space and Tesla's technical (not business) approach to it. It's produced results that are actually quite a bit better than I expected at this stage.

I still regularly see videos of Tesla's beta software attempting to pull out into traffic in situations that could clearly have very bad outcomes. I still see so much phantom braking that it's a collision risk.

I wouldn't call it dangerous, in the sense that it's done well enough that the person at the wheel should be able to handle it, but it would crash a lot without an attentive driver.

It's a long way from 99% reliability at this point.


The driver won't zone out if it's only getting things right a tiny fraction of the time.



