
I'm sorry but you seem to underestimate the complexity of language.

Language consists not only of text but also of context and subtext. When someone says "ChatGPT doesn't say what it wants to", they mean that it doesn't use text to say certain things, instead leaving them to subtext (which is much harder to filter out or even detect). It might happily imply certain things while refusing to say them outright, or even balk if asked directly.

On a side note: not all humans have a "separate inner voice". Some people have inner monologues, some don't. So that's not really a useful distinction if you mean it literally. If you mean it metaphorically, one could argue that ChatGPT has one too, even if the notion that it has anything resembling sentience or consciousness is clearly absurd.
