Pretty freaky article, and it doesn’t surprise me that chatbots could have this effect on some people more vulnerable to this sort of delusional thinking.
I also thought it was very interesting that even a subreddit full of die-hard AI evangelists (many of whom already hold a quasi-religious view of AI) would notice and identify a problem with this behavior.
They don’t understand why the limit is there…
It doesn’t have the working memory to work through a long conversation. When someone finds a loophole to load the old conversation and continue, it either outright breaks and freezes, or it falls into pseudo-religious mumbo jumbo as a way to respond with something…
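That limit is just the model's context window. Here's a rough sketch (the tokenizer stand-in, numbers, and names are all made up for illustration) of why pasting an old conversation back in doesn't restore anything:

```python
# Hypothetical sketch: models attend to a fixed context window, so a
# chat client has to drop old turns to stay under it. All numbers
# and the tokenizer stand-in below are illustrative, not real values.

CONTEXT_LIMIT = 8000  # tokens the model can "see" (made-up figure)

def rough_token_count(text: str) -> int:
    # Crude stand-in for a real tokenizer: ~1 token per word.
    return len(text.split())

def fit_history(turns: list[str], limit: int = CONTEXT_LIMIT) -> list[str]:
    """Keep only the most recent turns that fit in the window."""
    kept, used = [], 0
    for turn in reversed(turns):  # walk from newest to oldest
        cost = rough_token_count(turn)
        if used + cost > limit:
            break  # everything older is silently dropped
        kept.append(turn)
        used += cost
    return list(reversed(kept))

# Pasting a months-long transcript back in doesn't bring back "memory":
# the oldest turns still fall outside the window and are discarded.
old_conversation = [f"turn {i}: " + "words " * 50 for i in range(500)]
visible = fit_history(old_conversation)
print(len(old_conversation), "turns pasted,", len(visible), "turns actually seen")
```

The point is just that the trimming is silent: the model answers as if the dropped turns never happened, which is where the incoherent output comes from.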
It’s an interesting phenomenon, but hilarious that a bunch of “experts” couldn’t put two and two together to realize what the issue is.
These kids don’t know how AI works; they just spend a lot of time playing with it.
Absolutely. And to be clear, the “researcher” being quoted is just a guy on the internet who self-published an official-looking “paper”.
That said, I think that’s partly why it’s so interesting that this particular group identified the problem: they are pretty extreme LLM devotees who already ascribe unrealistic traits to LLMs. So if even they are noticing people “taking it too seriously,” you know it must be bad.
They didn’t identify any problem…
They noticed some people have worse symptoms and wrote those people off, while never second-guessing their own delusions.
That’s not rare either; it’s default human behavior.
You’re being awfully hard on them for having so much in common…
In the article they quoted the moderator (emphasis mine):
It seems pretty clear to me that they view it as a problem.
Then I’m shocked you didn’t make it to the second sentence:
Or even worse, you did read that and just can’t see the connection between the two sentences.
But I’ll never understand why people want to argue. You could have asked, I’d have explained it, and you’d have learned something.
Instead you wanted a slap fight because you didn’t understand what someone said.