What's different is intention. A human would form the intention to blackmail and then work toward that goal. If the output were a love letter instead of blackmail, the human would be either confused or psychotic. LLMs have no intentions; they just stitch together a response.
And what exactly is intention? It's the set of options you imagine you have based on your belief system, from which you ultimately make a choice. That can also be replicated in LLMs with a well-described system prompt. Sure, I'll admit that humans are more complex than the context of a system prompt, but the idea is not far off.
The personification makes me roll my eyes too, but it's kind of a philosophical question. What is agency, really? Can you prove that our universe is not a simulation, and if it is, do we then no longer have intention? In many ways, we are code running a program.