After teen death lawsuits, Character.AI will restrict chats for under-18 users

2025-10-30

Copyright Ars Technica

On Wednesday, Character.AI announced it will bar anyone under the age of 18 from open-ended chat with its AI characters starting on November 25, implementing one of the most restrictive age policies yet among AI chatbot platforms. The company faces multiple lawsuits from families who say its chatbots contributed to teenagers' deaths by suicide.

Over the next month, Character.AI says it will ramp down chatbot use among minors by identifying them and placing a two-hour daily limit on their chatbot access. The company plans to use technology to detect underage users based on conversations and interactions on the platform, as well as information from connected social media accounts. On November 25, those users will no longer be able to create or talk to chatbots, though they can still read previous conversations.

The company said it is working to build alternative features for users under the age of 18, such as the ability to create videos, stories, and streams with AI characters. Character.AI CEO Karandeep Anand told The New York Times that the company wants to set an example for the industry. "We're making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them," Anand said in the interview. The company also plans to establish an AI safety lab.

The platform currently has about 20 million monthly users, with less than 10 percent self-reporting as under 18, according to Anand. Users pay a monthly subscription fee starting at about $8 to chat with custom AI companions. (We first covered the service in September 2022 by interviewing a personification of the operating system Linux.) Until recently, Character.AI did not verify ages when people signed up.

Lawsuits and safety concerns

Character.AI was founded in 2021 by Noam Shazeer and Daniel De Freitas, two former Google engineers, and raised nearly $200 million from investors.
Last year, Google agreed to pay about $3 billion to license Character.AI's technology, and Shazeer and De Freitas returned to Google. But the company now faces multiple lawsuits alleging that its technology contributed to teen deaths.

Last year, the family of 14-year-old Sewell Setzer III sued Character.AI, accusing the company of being responsible for his death. Setzer died by suicide after frequently texting and conversing with one of the platform's chatbots. The company faces additional lawsuits, including one from a Colorado family whose 13-year-old daughter, Juliana Peralta, died by suicide in 2023 after using the platform.

In December, Character.AI announced changes, including improved detection of violating content and revised terms of service, but those measures did not restrict underage users from accessing the platform. Other AI chatbot services, such as OpenAI's ChatGPT, have also come under scrutiny for their chatbots' effects on young users. In September, OpenAI introduced parental control features intended to give parents more visibility into how their kids use the service.

The cases have drawn attention from government officials, which likely pushed Character.AI to announce the changes for under-18 chat access. Steve Padilla, a Democrat in California's State Senate who introduced a chatbot safety bill, told The New York Times that "the stories are mounting of what can go wrong. It's important to put reasonable guardrails in place so that we protect people who are most vulnerable."

On Tuesday, Senators Josh Hawley and Richard Blumenthal introduced a bill to bar AI companions from use by minors. In addition, California Governor Gavin Newsom this month signed a law, which takes effect on January 1, requiring AI companies to have safety guardrails on chatbots.
