Oh dear, this is clearly a buggy piece of software, not "mentally unwell". Probably the code to parse their input and output was written with the aid of AI.
They didn't say it IS mentally unwell. They said it was similar to that kind of posting.
Not to mention, when experts and laymen alike describe LLMs as exhibiting emergent basic signs of GI, we should also expect them to describe them as showing signs of being a mentally unwell GI. Hallucinations, for example, are one such sign.
The issue is that no, it doesn't exhibit emergent basic signs of GI. It's buggy software, nothing more. These people need to stop anthropomorphising their product.
I suspect a good portion of those debating may be marketing experts. But that doesn't change anything. Some folks are still debating the shape of the Earth, yet the matter is settled.