ChatGPT Psychosis
Aug. 29th, 2025 12:00 pm
Rogan: Okay, I know I owe responses to folks, but am kinda blurgh, so I promise I haven't forgotten y'all, it's just been a busy couple weeks!
So, thanks to erinptah, yesterday I learned about ChatGPT psychosis. As I linkjumped through the rabbit hole, one thing that stood out to me was how shocked people seemed to be that folks "with no prior history of mental illness" were falling into it. And I was like, "Well, yeah, of course, why is that surprising?" but I realized that other folks may not know this, so let me tell you why ChatGPT psychosis happens to "normal people."
A lot of people have this mistaken opinion that psychosis or extreme mental states only happen to abnormal people... you know, weirdoes like me. They think (erroneously) that it's purely a genetic or chemical thing, or maybe it can be brought on if you have a harrowing enough history, but if you don't have those things, then you're safe.
But that's not true at all. Anyone's mind can break. All you need is three things:
- A "reality-breaker," something that requires someone to majorly change how they see themselves or the world. (COVID-19. Political upheaval. Job and ensuing identity loss. Take your pick.)
- Mess with the person's food or sleep. (Especially deprivation--note how many of the "ChatGPT psychosis" cases involve sudden loss of shitloads of weight and staying up all night to talk to the bot.)
- Keep them isolated. Don't give them time or space to get away from this and digest it all. (Keep talking to the bot. Keep talking. Don't stop. Cut off everyone who doesn't agree with you.)
David Sullivan (RIP) was a professional cultbreaker. This was a man who, voluntarily and professionally, would join a cult to help get people out. And he talks about how he only ever allowed himself to stay for a few days, because one time he was stuck inside for two weeks, and by the end, he was hallucinating the leader's voice like everyone else! If HE could experience that, a man who knew a lot about what cults did and how they worked, a man who probably slid into the grave laughing because the Scientologists failed to take him out (and not for lack of trying), nobody is immune.
When I listened to that podcast and heard him talking about it, it was a revelation to me. It was an illustration to me just how fragile our minds are. Two weeks. That's all it takes. (And that presumes you're a guy like David Sullivan, and not a walking loony-bin like yours truly!)
Now, even if you ignore the global mind-crusher that is the COVID-19 pandemic, even if you set aside all the political upheaval and everything that has left a lot of people going, "Oh shit, I was wrong about a LOT OF THINGS," these chatbots are a reality-breaker for some folks, all by themselves. (Me. I include me.) They aren't called chatbots, like SmarterChild or what-have-you. They're called AI, a sci-fi term associated with sapient robots for decades in our pop culture. The hype train constantly encourages us to see them as sapient people, or even superhuman, all the while simultaneously assuring us that they're not really. It's all very two-faced and, in my opinion, intentionally confusing, because let's be real, these guys want to make money, and saying, "We made a better SmarterChild" just isn't gold in the bank.
Even sensible people might kinda stagger, wondering things like, "How do I interact with this being? It doesn't fit in my usual categories. Is it a person? Is it a thing? Is it something else?" So maybe they start talking. And then they keep talking, because let's be honest, it is kinda fascinating, talking to a being we've never encountered before, who behaves in a way we don't understand. We want to understand it.
And so we keep talking, which is what the companies really, REALLY want us to do, because the more time we spend on this, the more "engaged" we are, and engagement = money (hypothetically, theoretically, eventually). So whether they intentionally mean to or not, they have an incentive to make this chatbot as attention-sucky as possible. They're not going to FIX that, because that's the entire point, that's where the money comes from.
Admire the ingenuity of humankind. We were apparently so goddamned bored we decided to AUTOMATE mental breakdowns.
no subject
Date: 2025-08-29 06:13 pm (UTC)
All in all, that article and your thoughts on it were an important read for us; thank you for sharing. Truly no one is immune to things like this, no matter how much they may insist otherwise. It's easy to believe you couldn't fall for something, be affected by something, etc. until it happens. This whole situation with AI is all a big mess, to put it absurdly simply.
-The Silvermoon Team
no subject
Date: 2025-08-29 06:27 pm (UTC)
As we joked to
no subject
Date: 2025-08-29 08:34 pm (UTC)Unsurprisingly we liked Links the Cat and Rocky the Dog more than Clippy way back when we had the assistants on Word, haha!
-G (they/them)
no subject
Date: 2025-08-30 01:17 am (UTC)
no subject
Date: 2025-08-30 02:31 pm (UTC)
Oh yeah, Clippy! Recently, he's become the face of protest against corporate shittiness after a consumer-rights activist came out with this video "Change your profile picture to clippy. I'm serious." On YouTube, Clippy is everywhere. Some of them are edited to have little cowboy hats or anime hair or things like that.
-Kai
no subject
Date: 2025-08-30 06:17 pm (UTC)
no subject
Date: 2025-08-29 06:50 pm (UTC)
We're all doomed.
no subject
Date: 2025-08-29 07:18 pm (UTC)
no subject
Date: 2025-08-30 09:01 am (UTC)
~~~~~
In other news, this also explains why there's also recorded instances of otherwise mentally healthy people developing psychosis after practising mindfulness. Which I keep having to remind people about when they promote it as a totally harmless wellbeing activity.
no subject
Date: 2025-08-29 07:45 pm (UTC)
It's occurred to me on more than one occasion that if I hadn't encountered Draven's cult, and if I hadn't been the type of person who tends to defer to people who speak forcefully or authoritatively, I might have just started one of my own, and indeed I was probably on that path when I ran into Draven (mine would have been better! Mine would not have been a cult, but a *Movement.* My friend/neighbor Irish and I were even working out the logistics when the flame war happened between the LJ Matrix RP and Selina, and I found out about Draven's cult through Selina. But then Irish bailed because Draven rightfully gave her the creeps.)
Draven was also really good at manipulating people into sleep deprivation and isolation, even over long distances.
And all of this, and knowing I was vulnerable to it before, makes me hella nervous about ChatGPT and the effect it is having on people now, and enough to make me not want to go near it.
no subject
Date: 2025-08-29 08:02 pm (UTC)
I am perversely grateful that our brain reacts to things like ChatGPT with intense uncanny valley NO NO NO. Crazy brain knows! Crazy brain protects itself! Crazy brain has trouble at home, why go borrow more?
no subject
Date: 2025-08-29 08:45 pm (UTC)
So when I heard about ChatGPT psychosis, I pretty quickly got a picture of how it could develop.
no subject
Date: 2025-08-30 01:27 am (UTC)
It’s amazing how important sleep is. I can go without food for a day and mostly just get crabby and tired, but deprive me of sleep and I just disintegrate!
no subject
Date: 2025-08-29 09:50 pm (UTC)
(me. i knew. i been knew for years.)
no subject
Date: 2025-08-30 01:28 am (UTC)
no subject
Date: 2025-08-30 12:06 am (UTC)
Especially now that there are some actual lawsuits happening. The more the general public understands "this isn't just Already Troubled People who coincidentally fixate on your product, this is your product being a safety hazard for literally any user," the better.
no subject
Date: 2025-08-30 01:29 am (UTC)
no subject
Date: 2025-08-30 07:44 pm (UTC)
I've added this to my memories under "thought control," both because of the David Sullivan connection, and because these companies discourage critical thinking and encourage a mindset that is very susceptible to cults. It might not be their direct intent to promote psychosis, but it's an obvious side effect of their product and business model.
no subject
Date: 2025-08-30 09:32 pm (UTC)
It might not be their direct intent to promote psychosis, but it's an obvious side effect of their product and business model.
I hate that this is true.
no subject
Date: 2025-08-31 02:30 am (UTC)
no subject
Date: 2025-08-31 08:51 pm (UTC)
no subject
Date: 2025-08-31 08:27 am (UTC)
(I have so many grumps about AI Safety and AI ethics as, like, fields, but it means I end up hearing about this stuff.)
no subject
Date: 2025-08-31 08:52 pm (UTC)
no subject
Date: 2025-09-01 01:55 am (UTC)
no subject
Date: 2025-09-01 03:57 pm (UTC)
no subject
Date: 2025-09-04 01:00 am (UTC)
God I hate these things...
Date: 2025-09-01 07:47 pm (UTC)
Also, something something cults, something something 2 weeks, something something MAGA.... That might explain some things.
I know we're pretty dang gullible and trusting, and we're working on it. Have dodged this bullet so far, but between the evangelical church and almost joining an MLM, we're definitely pretty vulnerable to cult dynamics. Already enforcing a "no major life decisions get made in a crisis" rule. Should probably add "while exhausted" to that list.
Re: God I hate these things...
Date: 2025-09-02 07:12 pm (UTC)
Re: God I hate these things...
Date: 2025-09-06 11:52 pm (UTC)
I'd be down for book recs! Put my library card to use!
Exactly! It's just stupid spin! The only way this garbage will ever make money is if they get the populace addicted to using LLMs for their everyday lives, then revoke access and paywall it once folks are addicted. And that's so fucking predatory that we'd better add a few CEOs to our lists in that future.
Re: God I hate these things...
Date: 2025-09-04 01:35 am (UTC)