
I hand you a gun and say "you can point this gun at anyone, pull the trigger, and it won't harm them." You point the gun at a crowd, pull the trigger, and it goes off and kills someone.

Who do you blame?

No one is saying guns, or cars, aren't dangerous. However, this kind of false advertising leads people to use cars in dangerous ways while believing they are safe, because they've been lied to.

(Side note, there is some fine print you never read that says the safe-gun technology only works when you point the gun at a single person and doesn't work on crowds.)




Safer-than-human hardware does not imply safe.

It means this will kill people, but fewer people than human drivers would, and they have actual data backing up that assertion.

The benchmark is not an alert, cautious, and sober driver, because that's often not how actual people drive. So right now it's sometimes safer to drive yourself and other times it really is safer to use Autopilot; the net result is that fewer people die.
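To make the "net fewer deaths" arithmetic concrete, here is a minimal sketch; the rates below are made-up placeholders for illustration, not Tesla's or NHTSA's actual figures:

    # Hypothetical fatality rates per 100 million miles (illustrative only).
    human_rate = 1.2       # average across all human drivers, including impaired ones
    autopilot_rate = 0.8   # assumed lower, but still nonzero

    miles = 1_000_000_000  # one billion miles driven

    human_deaths = human_rate * miles / 100_000_000
    autopilot_deaths = autopilot_rate * miles / 100_000_000

    # Autopilot still kills people (8 here), just fewer than humans would (12).
    print(human_deaths, autopilot_deaths)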


But if you do happen to be an alert, cautious, and sober driver, it'd be unfortunate if Tesla's marketing led you to overly rely on Autopilot.

Ideally Tesla's marketing would say it's safer than drunk, sleepy, and reckless drivers, though it might not sell as many Autopilots.


Autopilot should not be accessible to drivers until their driving habits have been assessed and baselined. If Autopilot is known to drive more safely than the individual human driver in environments X, Y, and Z, then it should be made available, if not encouraged, in those environments (roughly as sketched below). That might not be an easy sell, since a major value proposition would be inaccessible to many of the people who most want to use it, but it's the most reasonable, safest path.
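A minimal sketch of what that per-driver, per-environment gate could look like; the environment labels, rate fields, and numbers are all assumptions for illustration, not anything Tesla actually ships:

    # Hypothetical gate: enable Autopilot only where it beats this driver's baseline.
    # All names and numbers here are illustrative assumptions.

    driver_baseline = {          # this driver's incidents per million miles, from an assessment period
        "highway_day": 0.50,
        "highway_night": 0.90,
        "city": 1.40,
    }

    autopilot_rate = {           # measured Autopilot incident rate per environment
        "highway_day": 0.30,
        "highway_night": 0.70,
        "city": 2.00,
    }

    def autopilot_allowed(environment):
        """Allow Autopilot only in environments where it outperforms this driver."""
        return autopilot_rate.get(environment, float("inf")) < driver_baseline.get(environment, 0.0)

    for env in driver_baseline:
        print(env, "->", "enabled" if autopilot_allowed(env) else "disabled")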

I also imagine that cars will learn from both human and each other's driving patterns over time, which (under effective guidance) should enable an enormous improvement in a relatively short period of time.


Drunk, sleepy, and reckless drivers tend to come from an age bracket not normally buying premium sedans.


I can grab the data, but the surprising thing about fatal crashes is the typical circumstance. It's an experienced driver driving during the day, in good weather, on a highway, in a sedan-style car, sober. There are two ways to interpret this data. The first is to assume that crashes are semi-randomly distributed; since this is probably the most typical driving condition, it naturally follows that that's where we'd expect to see the most fatalities.

However, I take the other interpretation. I don't think crashes are randomly distributed. The fact that that scenario is absolutely perfect is exactly the problem, because of humans. All a crash takes is a second of lapsed attention at a really bad time, and in such perfect circumstances we get bored and take road safety for granted, as opposed to driving at night or navigating tricky curves. That's a perfect scenario to end up getting yourself killed in. More importantly here, it's also the absolutely perfect scenario for self-driving vehicles, which will handle such conditions far better than humans simply because they're extremely trivial to drive in, and the vehicles will never suffer from boredom or lack of attention.


Their claim is about hardware, not software.


Just about every expert agrees, though, that their hardware is NOT capable without LiDAR (which is probably why they churn through people running their Autopilot program), although proving that in court is a whole thing.


That is not a distinction most people understand.


Or maybe there is no basis for making such a claim.

Or well, I just put four cameras on my car and inserted their USB wires into an orange. I declare that this is enough for FSD now. I don't have to prove anything for such an absurd statement?


And how do you know that the present hardware is enough to deliver full self-driving capability? I don't think anyone knows precisely what hardware is required at this point, since full self-driving hasn't been achieved yet.

So, do you think it may be possible that people do understand the distinction and are STILL not convinced?


Well, humans have two cameras that they can rotate, plus a brain, and that seems to be sufficient hardware for them to drive... :-)


I, and the law, would blame the person. They have no particular reason to believe your claim and absolutely no reason to perform a potentially lethal experiment on other people based on it.

You may be breaking the law as well, depending on the gun laws in your area, but as I understand it (IANAL) the manslaughter charge falls entirely on the shooter.


I would blame the person, because the best case scenario of shooting an allegedly harmless gun into a crowd is equivalent in effect to the worst case scenario of doing nothing in the first place.


I noticed this in the news... I can't believe someone actually tried this.

Tesla owner faces 18-month ban for leaving the driver's seat https://www.engadget.com/2018/04/29/tesla-owner-faces-road-b...



