Copyright channelnewsasia

SAN FRANCISCO: Startup Character.AI announced Wednesday (Oct 29) that it would eliminate chat capabilities for users under 18, a policy shift that follows the suicide of a 14-year-old who had become emotionally attached to one of its AI chatbots.

The company said it would transition younger users to alternative creative features, such as video, story, and stream creation with AI characters, while imposing a complete ban on direct conversations starting November 25. During the transition period, the platform will limit underage users to two hours of chat per day, tightening that restriction progressively until the November deadline.

"These are extraordinary steps for our company, and ones that, in many respects, are more conservative than our peers," Character.AI said in a statement. "But we believe they are the right thing to do."

The Character.AI platform allows users - many of them young people - to interact with beloved characters as friends or to form romantic relationships with them.

Sewell Setzer III shot himself in February after months of intimate exchanges with a "Game of Thrones"-inspired chatbot based on the character Daenerys Targaryen, according to a lawsuit filed by his mother, Megan Garcia.

Character.AI cited "recent news reports raising questions" from regulators and safety experts about content exposure and the broader impact of open-ended AI interactions on teenagers as driving factors behind its decision.

Setzer's case was the first in a series of reported suicides linked to AI chatbots that emerged this year, prompting ChatGPT-maker OpenAI and other artificial intelligence companies to face scrutiny over child safety.