You are seeing the Trolley Problem [https://en.wikipedia.org/wiki/Trolley_problem] in action. A majority of respondents, when asked that problem, would choose to let a few people die to save a larger number. The emphasis is on "let a few people die" rather than "kill a few people": machines (the lever in that problem, the autonomous car in this scenario) somehow evoke a more logical and less emotional response. I suspect machines create an emotional distance between the respondent and the people who would die. If the same respondents somehow felt more responsible for the deaths, i.e. if the machine were removed from the scenario, the responses would be very different.
I'm not implying that this lack of empathy is in any way acceptable, but people should know that when machines are involved in such scenarios it is much easier to distance oneself emotionally. This is not new; weapons are a prime example. Maybe by knowing about this dilemma people can react better to such tragedies.
I hope people show more empathy knowing this.