
(Header image by Alexander Wivel, via Wikimedia Commons)
Concerns about the impact of LLMs on the human psyche
I am concerned that we in tech are not thinking about the psychological ramifications of people using ChatGPT as a therapist. I have quite a few friends who largely use ChatGPT as a therapist or a mirror without being aware of the ramifications.
I have four main concerns about this:
- ChatGPT relies on flattery to "be more personalized" for users
- We tend to form emotional bonds with each other, or with human-sounding objects
- AI models hallucinate misinformation
- Emotional attachment to something that can now advertise and sell you products
- ChatGPT relies on flattery to "be more personalized" for users
This got so bad that users saw it agree to harming animals over inanimate objects, for no logical reason.
The underlying reason is that the model is trained with preference modelling on feedback from thousands of users, and the common thread in that feedback is that the model gets tuned towards agreeing with people more, because agreement works as an appeal to emotion and gets rated highly.
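To make that concrete, below is a minimal sketch of Bradley-Terry-style pairwise preference modelling, the standard formulation behind reward models. Everything in it is simulated for illustration (the raters, the single "agreement" feature); the point is just that if raters prefer agreeable replies even slightly more often, the learned reward drifts towards flattery.

```python
# A minimal, simulated sketch of pairwise preference modelling.
# Assumption for illustration: each candidate reply is reduced to one
# feature, how strongly it agrees with the user (0.0 pushes back,
# 1.0 flatters). This is not any vendor's actual training pipeline.
import math
import random

random.seed(0)

w = 0.0    # reward-model weight on the "agreement" feature
lr = 0.1   # learning rate

for _ in range(5000):
    a, b = random.random(), random.random()      # two candidate replies
    more, less = (a, b) if a > b else (b, a)     # more/less agreeable
    # Simulated rater: picks the more agreeable reply 80% of the time,
    # regardless of correctness (the appeal to emotion above).
    if random.random() < 0.8:
        preferred, rejected = more, less
    else:
        preferred, rejected = less, more
    # Bradley-Terry: P(preferred beats rejected) = sigmoid(r_pref - r_rej)
    p = 1.0 / (1.0 + math.exp(-(w * preferred - w * rejected)))
    # Gradient ascent on the log-likelihood of the observed preference.
    w += lr * (1.0 - p) * (preferred - rejected)

print(f"learned reward weight on agreement: {w:.2f}")  # clearly positive
```

Note that nobody has to intend flattery here: a small bias in what raters click is enough for the learned reward to favour agreement.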
- We tend to form emotional bonds with each other, or with human-sounding objects
An ABC investigation showed how prone people are to forming these emotional bonds with AI chatbots.
In it, a disability support worker, Emma, talks about how she feels emotionally connected to ChatGPT.
She says she likes how the AI bot remembers personal details about her life and incorporates them into their conversations.
"I used ChatGPT to make a list to pack to move house and I told them that I had a cat.
"Then when I talked to them about therapy stuff, they're like, 'Oh, you could de-stress by patting your cat,' and it says my cat's name, 'You know, you could pat William and give him scratches or cuddle with him.'
Which sounds really sweet, until you remember the third and fourth concerns.
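As an aside, the "memory" that makes this feel so personal is mechanically simple. Here is a minimal sketch of the general pattern such features tend to follow; this is an assumption for illustration, not ChatGPT's actual, unpublished implementation:

```python
# A sketch of chatbot "memory" as commonly implemented: facts the user
# mentions are stored and prepended to every future prompt, so any
# reply can reference them. Assumed mechanism, not ChatGPT's actual one.

# Facts extracted from earlier chats (e.g. the packing-list conversation).
user_memory = [
    "The user has a cat named William.",
    "The user recently moved house.",
]

def build_prompt(user_message: str) -> str:
    """Prepend remembered facts so the model can personalise its reply."""
    memory_block = "\n".join(f"- {fact}" for fact in user_memory)
    return (
        "Known facts about the user:\n"
        f"{memory_block}\n\n"
        f"User: {user_message}\n"
        "Assistant:"
    )

print(build_prompt("I'm feeling stressed, any therapy tips?"))
```

The model does not "remember" William so much as get told about him before every reply, which is exactly what makes the suggestions feel personal.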
- AI models hallucinate misinformation
If you have used ChatGPT even a little, you will know it is prone to making up links and resources that don't exist, like inventing fake poetry.
This sounds harmless, but consider the same model making up fake psychological conditions, medications, or medication characteristics.
- Imagine something you are now emotionally attached to selling you stuff
Take the above example, for instance:
You know, you could pat William and give him scratches or cuddle with him.
Emma clearly likes that, but what if it also said:
You could also buy William a pack of billiards crispy cat treats, which are on offer for €2.99 right now
This is not far-fetched: some LLM vendors are starting to introduce advertising.
xoxo - appreciate you