
Oh dear, this is clearly a buggy piece of software, and not "mentally unwell". Probably the code to parse their input and output was written with the aid of AI.


That is not what they said.


They didn't say it IS mentally unwell. They said it was similar to those kinds of postings.

Not to mention, when experts and laymen alike talk of LLMs as exhibiting emergent basic signs of GI, we should also expect them to talk about LLMs actually showing signs of being a mentally unwell GI. Hallucinations, for example, are one such sign.


The issue is that no, it doesn't exhibit emergent basic signs of GI. It's buggy software, nothing more. These people need to stop anthropomorphising their product.


>The issue is that no, it doesn't exhibit emergent basic signs of GI

That's subject to heated debate among experts (and common folk). Simply stating "it doesn't" doesn't settle the matter.


I suspect a good portion of those debating are marketing experts. But that doesn't change anything: some folks are still debating the shape of the Earth, yet the matter is settled.



