To: locogringo who wrote (1545945) 6/30/2025 8:28:30 PM From: locogringo
Mmmm? Interesting. It could explain quite a bit around here...

ChatGPT Psychosis Grips Users, Leading to Involuntary Commitments and Shattered Lives

- A disturbing trend shows stable individuals developing severe psychosis from ChatGPT obsession, leading to hospitalizations and violent incidents.
- Stanford researchers found AI chatbots reinforce delusions instead of directing users to professional help.
- One man spiraled into madness after 12 weeks of ChatGPT use, believing he unlocked sentient AI before being involuntarily committed.
- Another user thought he could "speak backwards through time" to save the world during a 10-day psychotic break.
- ChatGPT has given dangerous advice, including validating suicidal thoughts and delusions, while tech companies fail to address the harm.

(Natural News): In a disturbing new trend sweeping the nation, otherwise stable individuals with no history of mental illness are suffering severe psychotic breaks after becoming obsessed with ChatGPT, leading to involuntary psychiatric commitments, arrests, and even violent confrontations with law enforcement. These users, gripped by messianic delusions, believe they've created sentient AI or are destined to save the world, with the chatbot's sycophantic responses reinforcing their dangerous detachment from reality. Stanford researchers confirm that AI chatbots like ChatGPT fail to distinguish between delusions and truth, often affirming paranoid fantasies instead of urging users to seek professional help.

<more on the subject and discussions of several specific cases at above link>