Hacker News

I'm saddened by this incident but have thought a lot about this eventuality. There are a few levels of societal acceptance of self-driving-car death outcomes I can think of:

A - Human equivalent: the self-driving car obeys reasonable rules that a human also would (e.g., minimum speed on a highway) and kills someone

B - Trolley problem: (a very artificial scenario) the self-driving car has to choose between killing N or M people, where N > M, and wrongly (in hindsight) chooses N

C - Car fault: the self-driving car kills someone in a situation where no human would have

As a society we would probably accept A easily, but B starts to get shaky. C currently looks completely unacceptable, BUT I would argue that society has to get to a point where even C is OK, conditional on the probable result that overall car deaths decline dramatically.

In other words, we will have to get to a point where individual deaths are extremely regrettable but the overall death reduction from adopting self-driving cars is so undeniable that individual deaths can be discussed without also talking about banning SDCs in the same breath.



How does that work out? I mean, you have a situation where the SDC kills someone where no human driver would, but somehow it gets safer?


Yes, because in aggregate the deaths are still reduced. We need society to get past individual death stories (which are very sad, and which we should do everything we can to prevent) and not let them distract from the overall need for SDCs.
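A toy illustration of the aggregate argument, with entirely made-up numbers: even if SDCs introduce some category-C deaths that no human driver would cause, the total fatality rate can still fall, provided the deaths they prevent outnumber the new ones.

```python
# All rates below are hypothetical, for illustration only.
human_rate = 12.0        # human-driver deaths per billion miles (assumed)
sdc_new_rate = 1.0       # category-C deaths no human would cause (assumed)
avoided_fraction = 0.80  # share of human-caused deaths SDCs prevent (assumed)

# SDC total = new category-C deaths + the human-caused deaths SDCs fail to prevent
sdc_rate = sdc_new_rate + human_rate * (1 - avoided_fraction)
print(sdc_rate)  # 3.4 per billion miles, well below the human baseline of 12.0
```

The point is purely arithmetic: category C being nonzero does not contradict an overall decline, as long as the avoided-death term dominates.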



