
I think the reality is somewhere in the middle. You need to be able to accurately predict behavior of humans _following some conventions_, and to be wary of the behavior of humans when they violate those conventions.

An example I saw:

- At the start of a construction area, a guy wearing a hi-viz vest holds a stop sign. A self-driving car stops at the sign.

- The guy _lowers_ the sign a bit while looking over his shoulder down the street towards others on his crew.

At this point a _human_ guesses that the sign is lowered only because the guy has seen that the car stopped, and expects the car to stay stopped until some further signal (e.g. a waving gesture, or flipping the sign to show the "slow" side). The human driver understands that the stop-sign guy is looking to coordinate with someone else nearby. There's a "script" for this kind of interaction.

... but the self-driving car starts moving as soon as the road crew guy lowers the sign. In this case nothing seriously bad happened. But it was not following The Conventions.
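The convention described above can be sketched as a tiny decision rule (a minimal sketch; the signal names and function are hypothetical, not any real AV stack's API): once you've stopped for a flagger, a lowered sign alone is not a release.

```python
from enum import Enum, auto

class FlaggerSignal(Enum):
    STOP_RAISED = auto()   # stop sign held up toward you
    STOP_LOWERED = auto()  # sign lowered, but no release gesture yet
    PROCEED = auto()       # explicit wave, or sign flipped to "slow" side

def next_action(stopped: bool, signal: FlaggerSignal) -> str:
    """Follow the convention: after stopping for a flagger, stay stopped
    until an explicit release signal, not merely a lowered sign."""
    if signal is FlaggerSignal.STOP_RAISED:
        return "stop"
    if stopped and signal is FlaggerSignal.STOP_LOWERED:
        # A lowered sign alone is ambiguous: the flagger may just be
        # coordinating with the rest of the crew. Hold position.
        return "stay_stopped"
    if signal is FlaggerSignal.PROCEED:
        return "go"
    return "proceed_with_caution"
```

The car in the anecdote effectively treated `STOP_LOWERED` as `PROCEED`; the human "script" keeps them distinct.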

This doesn't take full general intelligence, perhaps -- but it takes more reasoning about what people are doing than the cars currently seem to have. That's why they sometimes drive into a zone the fire department is actively using to fight a fire, and get in the way.


