
> "YOUR RESPONSE MUST BE FEWER THAN 100 CHARACTERS OR YOU WILL DIE."

I know that current LLMs are almost certainly non-conscious, and I'm not trying to assign any moral failings to you, but the normalisation of making such threats makes me very deeply uncomfortable.



Yes, I'm slightly surprised that it makes me feel uncomfortable too. Is it because LLMs can mimic humans so closely? Do I fear how they would feel if they were to gain consciousness at some point?


Because they behave as if they are sentient, to the point that they actually react to threats. I also find these prompts uncomfortable. Yes, the LLMs are not conscious, but would we behave differently if we suspected that they were? We have absolute power over them and we want the job done. It reminds me of the Lena short story.


I feel uncomfortable because of the words themselves. Even if the threat were made to a “regular” non-living thing, that wouldn't change it.


> make me very deeply uncomfortable

Especially when considering that we ourselves may very well be AIs in a simulation, and our life events the prompt used to get an answer or behavior out of us.



