
> It's kind of a wild sign of the times to see a tech company issue this kind of post mortem about a flaw in its tech leading to "emotional over-reliance, or risky behavior"

Their intended users are basically everyone in society--people who are low in many types of intelligence, including social intelligence. People with a range of emotional and mental health challenges. Children. Naive users who anthropomorphize AI and can't be expected to resist this impulse at every moment of interaction.

They aren't designing exclusively for very bright, emotionally well-adjusted people who are deeply familiar with technology and working in a professional context.

You might find it "wild" that they consider all of this. That's fine. But a company sensibly doing this work ought to avoid hiring you.
