After the Times story [0], the limitations of the context window look fatal for these different offerings, so we'll see more marketing stories like this.
everdrive commented, "People can incorrectly attribute consciousness, intent, and personality to things which don't actually possess it" [0], and I posted my reaction to Hopkins' piece below.
At the most basic level, we can have a tool that sends us daily affirmations, but it does not possess consciousness and perceive us, let alone form an opinion of us. It is possible to have a parasocial connection to an app, just as with a famous person. If Hopkins has asked the AI boyfriend to justify its utterances with respect to past events or interactions, that would be useful to read.
We'll have to look at this particular article in terms of error types:
Type I: the writer is a real person with a real story. She admits she has a type, and OnlyFans would have satisfied that for less than $70. But others using the $2,400 ChatGPT Pro have found that even it lacks enough context for anything longer than a fantasy. The only reason to write about a $70 app is that OnlyFans does not allow AI interaction.
Type II: we incorrectly conclude this is a real person. Maybe that is less interesting, but I wonder what would happen if we asked AI Boyfriend to write his own 1,000-word advertisement. Very Turing test.
[0] https://news.ycombinator.com/item?id=42710976