> Braking to avoid hitting an obstacle (like that tree branch in the first example) is hardly "acting unexpectedly".

Depends on the size of the tree branch.

But also, being hit from behind at low speeds was a pretty common thing in early testing in Mountain View, when they were using the Google name on the cars. That this report lists only a handful of such incidents means either the software has gotten better at communicating its intent to other drivers, or the driving public is aware that Waymo cars are way more cautious --- if they only make lane changes and unprotected turns by engraved invitation and everyone knows it, that's fine too.

In the early days, it seemed like it might be appropriate to install a 1979-regulation 5 mph rear bumper on these cars, because they'd likely get hit often enough.




>this report means either the software has gotten better at communicating its intent to other drivers, or the driving public is aware that Waymo cars are way more cautious

There's at least one other alternative: selection bias. Since there is no standard industry definition of a reportable incident, companies are allowed to leave many incidents unreported.

>"Waymo, on the other hand, ran complex computer simulations after each disengagement, and only reported to the DMV those where it believed the driver was correct to take charge, rather than being overly cautious."[1]

So it may not be reported unless it meets Waymo's (non-independent) selection criteria. I think most people can at least recognize there is a potential conflict of interest when objective reporting isn't required.

[1] https://spectrum.ieee.org/have-selfdriving-cars-stopped-gett...


    Depends on the size of the tree branch.
I love these types of comments on HN: so vague as to be meaningless.

As a thought experiment: let's put you in a driving simulator and ask you to re-drive the same scenario 10,000 times. In each run, we randomly change the size of the branch -- both weight and volume. Repeat the same test with 100 other drivers. Repeat the same test with 100 different AV algorithms. I am sure each driver would have their own definition of "branch too large to safely drive over".
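Purely as an illustration, here's a minimal Python sketch of that simulator experiment. Every number in it is a made-up assumption (the threshold spread, the branch-weight distribution); the point is just that once each driver has their own threshold, a large share of branch sizes produces a split decision:

    import random

    # Sketch of the thought experiment: every driver (or AV policy) has
    # its own "branch too large to safely drive over" threshold; measure
    # how often the panel disagrees on a randomly sized branch.
    random.seed(0)
    N_TRIALS, N_DRIVERS = 10_000, 100

    # Assumed braking thresholds in kg -- purely hypothetical numbers.
    thresholds = [random.uniform(0.5, 5.0) for _ in range(N_DRIVERS)]

    split = 0
    for _ in range(N_TRIALS):
        branch_kg = random.uniform(0.1, 10.0)  # random branch weight
        brakers = sum(branch_kg > t for t in thresholds)
        if 0 < brakers < N_DRIVERS:            # the panel disagrees
            split += 1

    print(f"Trials with a split decision: {split / N_TRIALS:.0%}")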

What concrete point are you trying to make?

    being hit from behind at low speeds was a pretty common thing in early testing in Mountain View, when they were using the Google name on cars.
Really? And "pretty common thing in early testing" is an editorialised phrase that, to me, is again so vague as to be meaningless. Common from whose perspective? Early testing relative to what?


My recollection is that during the early on-road testing, Google was required to report all collisions, regardless of severity (which another poster mentioned isn't the case here), and that in most of those collisions the Google vehicle was hit from the rear at low speed when the driver behind expected it to go through. Without video or other imaging it's hard to judge the exact circumstances, of course (and even with video it can be difficult). There may have been a couple of inattentive drivers that ran into them at higher speed. And there was the collision where the Google car tried to change lanes into a VTA bus that its algorithm predicted would move for them, being a professional driver and all.

From driving near these things: they're very cautious, and sometimes they start to move and then don't (I saw one try to make a lane change for about a mile on El Camino before it managed to, turning the blinker on and off the whole time).

>> in early testing in Mountain View, when they were using the Google name on cars.

> Early testing? From what perspective.

You know, before they switched the name to Waymo.


The implication is that drivers intentionally ram Google vehicles when they have a ghost of a reason to do so.


Nah, the implication is that many human drivers speed, tailgate, don't pay attention, etc. There's a reason blame for a rear-end collision is assigned to the car behind by default.

On the road, sometimes I feel like I'm the only one following the rules. :/


Braking for a tree branch of any size seems preferable to running into highway barriers[1] and fire trucks[2].

[1] https://www.kqed.org/news/11801138/apple-engineer-killed-in-...

[2] https://abc7news.com/tesla-autopilot-crash-driver-assist-cra...


Not for a branch that's 2 inches long and a fraction of an inch in diameter.


>Depends on the size of the tree branch.

Our thinking-fast reflexes are to avoid pretty much any obstacle, living or dead. Hopefully our thinking-slow brain has time to do a proper evaluation before doing anything drastic.



