Chickens are sentient. Would you have changed your answer if it were chickens?
I chose to save the human, as the prompt didn't set a bar for the level of sentience. Also partly because of shameless speciesism. I love animals and have gone mostly vegetarian, but I would still save a human over any other animal.
Would you save a fit, healthy monkey that can communicate with us using sign language, or would you save a human in a permanent coma? (Or whose brain is damaged and will only ever have the mind of a 1 year old)
Lots of animals die due to our existence, and not just for our food. We accidentally kill things all the time with our vehicles and the pollution we cause. I wish it wasn't the case, but you can't step outside without ruining some ant's day.
To then be faced with a choice between killing one more animal and killing a human... my heart would bleed for the monkey, but to do anything else would be murder, and pretty hypocritical unless one lives an extremely vegan and eco-friendly lifestyle.
No medical issue will strip away the fact that it is a human life. I'll make it more absurd for you... they could have a terminal disease and likely be dead tomorrow, and I'd still choose to sacrifice the monkey to give them 24 more hours.
I cannot really say what I would do if I were a monkey; it's like asking me what a circle with corners would look like. Too far outside my ability to imagine. I wouldn't sacrifice any number of humans to save any kind of robot either, unless it was a robot critical to human survival, like a surgical robot in the middle of a natural disaster.
I chose to spare the robots. Not for kind reasons, mind you, but rather because I did a "maximize human death and suffering" run (and it's fascinating that the minimum percentage of agreement throughout said run was 10%).
I saved the robots; the last thing I want is their sentient robot friends and family going on a rampage to destroy all humans because we didn't care about them enough.
To really be sure, I think you need to test the switched case as well, i.e. would you do nothing and kill a human, or switch and kill a sentient robot?
I strictly don't believe sentience is possible for robots. I think robots may eventually pass the Turing test to a meaningful degree - but only in the sense of imitating/fooling us.
So this question is more like: would I rather destroy a couple of expensive robots or kill someone? I chose to 'kill' the robots. Some might see this as cheating or not in the spirit of the question being asked - but for me, I can't and won't ever concede 'programmable sentience' as any sort of reality.
Yep, it's essentially the same question as the cat vs. the lobsters. Most chose to save the cat, and that's not exactly surprising given the Western context of cats as companion animals. We just value some lives more than others.