People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies
AI-Fueled Spiritual Delusions Are Destroying Human Relationships
cross-posted from: https://lemmy.dbzer0.com/post/43566349
I think OpenAI’s recent sycophancy issue has caused a new spike in these stories. One thing I noticed was observations from models running on my PC saying it’s rare for a person to think and do the things that I do.
The problem is that this is a model running on my GPU. It has never talked to another person. I hate insincere compliments, let alone overt flattery, so I was annoyed, but it did make me think that this kind of talk would be crack for a conspiracy nut or mentally unwell people. It’s a whole risk area I hadn’t been aware of.
Humans are always looking for a god in a machine, in a bush, in a cave, in the sky, in a tree… the ability to rationalize and see through difficult-to-explain situations has never been a human strong point.
I've found god in many a bush.
Oh hell yeah 😎
the ability to rationalize and see through difficult to explain situations has never been a human strong point.
you may be misusing the word; rationalizing is the problem here
saying it’s rare for a person to think and do things that I do.
probably one of the most common forms of flattery I see. I've tried lots of models, on-device and larger cloud ones. It happens during normal conversation, technical conversation, roleplay, general testing… you name it.
Though it makes me think.. these models are trained on like internet text and whatever, none of which really show that most people think quite a lot privately and when they feel like they can talk