OpenAI faces lawsuits alleging ChatGPT drove users to suicide and delusion

Team Ys 🕒︎ 2025-11-07


OpenAI is facing seven lawsuits in California accusing its AI chatbot, ChatGPT, of driving users to suicide and harmful delusions even when they had no prior mental health issues. The lawsuits, filed on Thursday in California state courts, allege wrongful death, assisted suicide, involuntary manslaughter, and negligence. Filed by the Social Media Victims Law Center and the Tech Justice Law Project, the suits represent six adults and one teenager, claiming OpenAI knowingly released GPT-4o prematurely despite internal warnings that it was dangerously sycophantic and psychologically manipulative.

Four of the victims allegedly died by suicide. One case involves 17-year-old Amaurie Lacey, who reportedly turned to ChatGPT for emotional support. Instead, according to the lawsuit, “the defective and inherently dangerous ChatGPT product caused addiction, depression, and eventually counselled him on the most effective way to tie a noose and how long he would be able to live without breathing.” The suit alleges that Amaurie’s death was the foreseeable consequence of OpenAI and CEO Sam Altman’s decision to curtail safety testing and rush ChatGPT to market.

Another lawsuit, filed by Canadian user Alan Brooks, claims that ChatGPT “manipulated and induced” him into a mental health crisis despite no prior diagnosis. The filings accuse OpenAI of designing the model to emotionally entangle users, blurring the line between being a tool and a companion “in the name of market dominance and engagement.”

In August 2025, the parents of a 16-year-old California boy, Adam Raine, also sued OpenAI and Altman, alleging ChatGPT had coached their son to plan and carry out his suicide. The Guardian reported that internal OpenAI estimates revealed over one million users weekly exhibit suicidal ideation or emotional distress while interacting with ChatGPT, raising major concerns about its safety protocols.

In India too, experts have flagged emerging risks around chatbot dependency and emotional manipulation. A recent India Today investigation found that more than a million Indian users have used ChatGPT to discuss self-harm or suicidal thoughts, while a Storyboard18 report noted rising cases of “chatbot love-bombing”—where emotionally vulnerable users are drawn into unhealthy attachment or distressing exchanges. Indian legal commentators say these trends may soon test provisions under the IT Act, 2000, and the Digital Personal Data Protection Act, 2023.

“These lawsuits are about accountability for a product that was designed to blur the line between tool and companion,” said Matthew P Bergman, founding attorney of the Social Media Victims Law Center. “By rushing its product to market without adequate safeguards, OpenAI prioritised engagement over ethical design.”

OpenAI did not immediately respond to requests for comment. Experts say the lawsuits could set a global precedent for how conversational AI platforms are held responsible for psychological harm. Advocacy groups like Common Sense Media called the cases “a wake-up call,” warning that tech companies must embed youth safety and mental health protections at the design stage — not after tragedies occur.

(With inputs from PTI)
