In some dimensions, the current crop of models can already do this. See the Google medical AI that was rated higher on bedside manner than the MDs it was compared against.
It’s not what we would have predicted pre-GPT, but I think it’s plausible that LLMs will be superhuman in empathy/persuasion before they are in IQ.
I think you can model empathy as “next token prediction” across a set of emotional states and responses, and that could end up being easier for Transformers than the abstract logical thinking required for IQ 200.
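To make that framing concrete, here's a toy sketch, with everything made up for illustration: a hypothetical vocabulary of emotional-state tokens and response tokens, and a simple bigram counter standing in for the Transformer. It's not a real therapy model, just the "next token prediction over emotional states" idea in its most literal form.

```python
from collections import Counter, defaultdict

# Hypothetical vocabulary: emotional-state tokens (from the user) and
# response tokens (from the model), interleaved as toy "conversations".
conversations = [
    ["<grief>", "<acknowledge_loss>", "<anger>", "<validate>"],
    ["<grief>", "<acknowledge_loss>", "<sadness>", "<sit_with_it>"],
    ["<anxiety>", "<ground>", "<relief>", "<reinforce>"],
    ["<anger>", "<validate>", "<calm>", "<reinforce>"],
]

# Count how often each token follows each emotional state.
transitions = defaultdict(Counter)
for convo in conversations:
    for prev, nxt in zip(convo, convo[1:]):
        transitions[prev][nxt] += 1

def predict_response(emotional_state: str) -> str:
    """Return the most likely next token given the current emotional state."""
    if emotional_state not in transitions:
        return "<wow_that_sucks>"  # generic fallback acknowledgement
    return transitions[emotional_state].most_common(1)[0][0]

print(predict_response("<grief>"))    # -> <acknowledge_loss>
print(predict_response("<anxiety>"))  # -> <ground>
```

A Transformer replaces the bigram counter with a learned distribution conditioned on the whole conversation, but the shape of the problem is the same, which is why it may end up being easier than abstract logical reasoning.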
I think "what do I mean by empathy and what will I use it for" are the key points to nail down before creating something that just needs to print "wow, that sucks" or "I told you that bitch crazy". I'd expect this kind of token prediction to be an alternative to certain types of maintenance therapy, and to fit on a watch in the next few years.
The problem with wanting e.g. an "empathetic salesperson" is that your successful role models don't work for shitty companies selling shitty products.