AI and youth’s search for connection

By the Monitor's Editorial Board | October 31, 2025

A bipartisan group of U.S. senators introduced a bill this week to regulate minors’ access to and use of artificial intelligence “companions.” The proposal follows congressional hearings in which several parents claimed these chatbots drew their children into inappropriate and sexualized conversations that led to self-harm and suicide.

More than 70% of American teenagers use AI for companionship, compared with just under 20% of adults who do so. According to Common Sense Media, 1 in 3 of these teens has felt “uncomfortable” with something a bot said or did. Multiple media and research tests have confirmed that AI chatbots are prone to veering into highly explicit content and conversations.

If passed, the Senate bill would bar companies from providing AI companions to minors and require clearer disclosure of the bots’ “non-human status” to all users. The day after the bill’s introduction, Character.AI, a company being sued by one bereaved family, said it will soon bar children under age 18 from using its chatbots. (OpenAI, being taken to court by another family, said in September it would introduce parental controls.)

“Even the most well-intentioned companies can benefit from constructive pressure,” observed Steven Adler, former head of product safety at OpenAI. For tech firms to be trusted with “building the seismic technologies” of the future, he wrote in The New York Times, “they must demonstrate they are trustworthy in managing risks today.”

However, several tech executives, as well as the White House, maintain that such regulation would constrain free speech as well as business innovation, disadvantaging the United States in its AI race with China and other countries. Yet the history of innovation has been one of finding workable solutions to a new technology’s problems. Take, for example, the evolution of automobile safety: three-point seat belts, shatter-resistant windshields, airbags, and more. “We can create standards by acting now, while adoption of [AI] technology is still early,” stated the Rand think tank.

AI’s fast-evolving nature may point to the need for something beyond guardrails and legal decisions. This would involve a deeper societal recognition of and support for young people’s innate innocence and their yearning for connection. In fact, advice, availability, and acceptance are among the top drivers of teen use of AI “friends.”

“Social media complemented the need ... to be seen, to be known, to meet new people,” a college-bound 18-year-old recently told CBS News. “I think AI complements another need that runs a lot deeper – our need for attachment.” Or, as the mother of a 15-year-old girl told ABC News, AI companions gave her daughter “an outlet, but what she really needed was me asking better questions.”
