
>The new data comes at a crucial time for the self-driving industry.

Which is, by the way, a good reason to be skeptical of it. I remember talking to someone who worked with BMW on their self-driving program a long time ago, and their take on Tesla's self-driving effort was (a) it's fine for Tesla to have a bad reputation for safety, but BMW simply can't choose to get a bad reputation for safety because they sell far too many non-autonomous cars, and (b) it's actually not fine for Tesla (and others) to be rushing ahead with self-driving, because they will kill people, and they're just as likely to kill the whole self-driving industry at the same time.

I have no doubt that even if the data looked terrible, Waymo would find a way to spin it to look safe. I also have no doubt that even if the data is good, it's not indicative of self-driving being safer in the average situation.




The way I'm reading the PR right now: Cruise damaged the public's perception of self-driving cars with a cover-up, and now the ball is in Waymo's court to over-prove that they're safe and transparent. Unfortunately, this is one of those problems where "vibes over facts" persists. As in, a lot of people will argue that a self-driving car killing 10 people is worse than human drivers killing 20 people. It's not easy to change public sentiment, especially when there are huge truckloads of money on the table for multiple industries to either lose or gain in the end.


This is like saying a shitty doctor is okay because they kill fewer people than a faith healer. You have to hold professionals to a higher standard. This is not rocket science.


If the choice is solely between a shitty doctor and a faith healer (picking neither isn't an option, as people have to get from A to B somehow), then you would choose the shitty doctor, no? In the States, I just don't see any other alternatives for the near future.

The US is a beast where a supermajority of people have given up on public transport and infrastructure. Otherwise I would hold the bar higher and look into alternatives.


A shitty doctor might be ok if your only alternative is a faith healer. It depends on which one has better results (the shitty doctor can have worse results than the placebo faith healer).

In some places, this is a real choice to be made, and developed world luxuries of “holding professionals to higher standards” aren’t available.


Sir, our data analysis shows that you and your family would have been crushed under one of our trucks last month if it hadn't been for its autonomous driving software. Based on your internet commenting history, you have expressed a preference to be crushed under the vagaries of human-spectrum incompetence rather than being subjected to the soulless, less-than-perfect machine behavior offered by our corporate overlords.

We are here to offer you a correction of this regrettable mistake. Could you please call your family and follow us to the flesh compactor? We assure you that it is operated by a real, drunk human.


Wouldn't that claim of "only 3 minor injuries" be relatively easy to refute?

Sure, I'm not so naive as to think there's no corporate spin, especially given Alphabet's marketing and media resources and the need for positive sentiment towards Waymo. But if Waymo's safety record were anywhere near Tesla's (or Cruise's), I think it would be very hard to spin it this well without being refuted; they would be better off not giving an update at all in that case.


Absolutely not. First, a huge number of these miles are either being driven directly under test conditions or with employees as the passengers, and any actual passengers have probably signed significant waivers. Second, when accidents do happen, I bet some legal department at Google is highly motivated to step in and "investigate", i.e., get everyone involved to sign NDAs and then wrap up the investigation with a conclusion that doesn't impact their stats.

I'm not saying this is happening, but it would be entirely unsurprising to me if it were. And there are actually some rumours of Tesla engaging in that kind of behaviour: classifying crashes as unrelated to Autopilot because Autopilot disengages in the moments leading up to the crash. How would anyone refute that? And if they tried, do you think they'd face significant legal jeopardy from Tesla?





