
Depending on driverless taxis instead of having emotional people who need parking all the time seems to be a win, as long as they are safe and someone is held accountable when they mess up.

I am very unwilling to change 'people are flawed but (mostly) held accountable' to 'a corporation-owned robot car ran someone over because of a software glitch and it is nobody's fault'. The problems of 'oops, we got hacked, or lost your data, or locked you out of a service, or deprecated a product, because we spend no money on things that are not income generating' cannot be allowed to carry over to such a system.



Suppose it's the near future and driverless cars are causing 5x fewer injuries per mile than human drivers. (5x fewer != 0, so some injuries and deaths still occur)

What accountability would you like to see in that situation?

Or a table of

  | relative death rate | accountability |
would cover the widest range of future scenarios.


If there is an investigation, and it is found that someone said 'don't bother with those tests, just push to market' or 'let's fire the legacy system patching team because they make no money, even though the cars are still driving around', and that directly or indirectly caused a car to glitch and run someone over, or to get hacked and used for a crime, then someone goes to prison and/or the company gets liquidated, or something in between that is severe enough to make it unattractive for another company to do the same thing.

I am not a lawyer or a legislator and I cannot come up with a regulation or law that would do this, but I am sure someone can.


It's a lot easier to hold a single corporation accountable than it is to assign the distributed responsibility of individuals. Uber killed someone and they had to leave the industry. Tesla is being actively investigated for their fly-by-night approach.

Additionally, when one of these cars fucks up, these companies know exactly what happened, because they are collecting data about the cars' performance 24/7.


> Additionally, when one of these cars fucks up, these companies know exactly what happened, because they are collecting data about the cars' performance 24/7.

And if it turns out they had data showing they knew someone was going to get killed, but they would make more money by not fixing the bug -- I would want someone to go to jail.


Then probably no one is going to implement it. Or is it only jail time when the deaths exceed what human drivers would have caused?




