The problem is the blame game. When a human behind the wheel hits a pedestrian and is found at fault, that "horrible inattentive/drunk/whatever driver" goes to jail for the death of another human being. It never makes anything "right", and I won't even start on how a jail sentence of any length can ruin your life in the US, but the public as a whole gets the satisfaction that "justice has been served".
How do we handle this for autonomous vehicles? Do we just fine or sue the company that made the vehicle or developed the software? Do we send imperfect human developers to jail because they made a mistake, even if, in the grand scheme of things, their software has saved lives compared to a human being behind every action a vehicle makes?
A big part of the public image of autonomous cars is increased safety, so any deaths at their hands raise the question of where and how to place the blame - a subject I think very few are prepared for right now. That's likely part of why Tesla explicitly states that Autopilot requires a human driver present, and why Google has been extremely cautious, sticking to operator-supervised tests until recently.