
‘A dark online world snuffed out my child’s life – we must do more to save kids’

By Almha Murphy


In 2017, 14-year-old Molly Russell took her own life after falling victim to a dark algorithmic vortex online. She was exposed to dangerous social media posts that showed her ever more content associated with anxiety and depression.

“She was the most amazing person,” Ian Russell, Molly’s father, tells The Mirror. He couldn’t wait to see what she would do with her life. But her possibilities were “snuffed out” by the dark stream of content she was exposed to, which “convinced her she was in fact more trouble to other people than she was worth, that she was a hopeless case, that there was no future to her, and somehow suggested to this extraordinary, wonderful individual that the only way forward for her was to end her own life.” Ian adds: “That content is still there.”

Since Molly’s death, Ian has campaigned heavily for the Online Safety Act, which places a duty on online providers to make their platforms safe for users. Through the Molly Rose Foundation, a charity he founded in his daughter’s name, Ian seeks to create a safer digital world for children. He envisions this as “one in which young people can enjoy the benefits of the digital world in which we live without being recommended or stumbling across harmful content.”

According to an Ofcom survey conducted earlier this year, 59% of teenagers aged 13-17 reported encountering potentially harmful content online over a four-week period.

Ian says the digital world can make it easier for bullies to cause harm and for children to push boundaries. “In the online world, there are no safety fences around those activities,” he says.

This has the potential to worsen, given the rollback of certain safety features on social media platforms this year. In January, Meta announced that it would be removing fact-checkers and reducing censorship on its platforms, which include Facebook and Instagram.
In a video, Meta CEO Mark Zuckerberg admitted that this would mean “catch[ing] less bad stuff”.

Ian believes these platforms have a responsibility towards the people, and in particular the young people, who use them. “They shouldn’t release something potentially harmful or dangerous without finding out how they can make it as safe as possible,” he tells The Mirror. “They shouldn’t be testing their new techniques on the global population, and in particular, they shouldn’t be testing it on young people who are less equipped to protect themselves.”

Generative AI chatbots such as ChatGPT pose a similar threat, Ian warns. Research by the Centre for Countering Digital Hate recently found that ChatGPT will instruct teens on drug use and diet restriction, and will draft suicide notes. “It’s just painful to see the same mistakes being made but at a higher speed,” he says.

The Mirror has reached out to Meta and OpenAI for comment. Meta says there has been no change to its commitment to youth safety and age-appropriate experiences, including how it treats content that exploits or endangers children. It also uses automation to avoid recommending sensitive content, including material relating to self-harm and suicide, to teens, and to hide some of this content from them completely, even if it is shared by someone they follow. OpenAI, whose ChatGPT is intended for people aged 13 and up, says it is building a long-term system to understand whether someone is over or under 18 so their ChatGPT experience can be tailored appropriately. In the meantime, it is introducing parental controls by the end of the month.

While the Online Safety Act has made strides in protecting children from online harm, Ian believes that more needs to be done.
According to polling by the Molly Rose Foundation, 85% of parents would support a new Online Safety Act that strengthens online safety for children. Ian adds that Ofcom is being “too timid” in enforcing the Act, which calls on service providers to take down illegal content, with regard to suicide forums.

The Molly Rose Foundation has partnered with the grieving relatives of Hannah Aitken, from Caterham, Surrey, who died aged 22 in 2023; Aimee Walton, from Southampton, who died aged 21 in 2022; and Tom Parfett, from High Wycombe, who died aged 22 a year earlier. All three had ingested the lethal poison that Canadian national Kenneth Law is suspected of supplying, via these online forums, to vulnerable people in the UK who intended to take their own lives. He is suspected of selling and shipping lethal doses of a lawful substance to individuals across the UK. Police say Law, 59, may have sent a total of 1,200 packages to people in 40 countries. He is facing 14 murder charges in Ontario; he is in custody and his trial is expected to begin next January. The Mirror has taken the editorial decision not to name the poison or the forum.

These forums are just one bleak door through which pro-suicide content can be accessed online. Ofcom is investigating the pro-suicide forum under the Online Safety Act, but it remains online. An Ofcom spokesperson told The Mirror: “We launched our investigation into a suicide forum shortly after new duties under the Online Safety Act came into force. Following our engagement with the provider of this forum, it has chosen to restrict UK users’ access to the site. Our investigation into its compliance with the Act remains ongoing, and we’ll provide an update as soon as possible.”

Suicide is the main cause of death for young people under 35, according to the charity Papyrus. Meanwhile, an average of five people take their own lives every single day in the UK.
A study conducted between 2014 and 2015 by the University of Manchester found that 23% of suicides among under-25s followed suicide-related internet use.

For Ian, a strengthened Online Safety Act would give Ofcom targets to hit in making the digital world safer, including identifying and preventing emerging online harms. “We need to stop that dreadful, weekly clock of another lost young life ticking away and introduce the strengthened Act, the new provisions that we know that we need,” Ian says.

For emotional support you can call the Samaritans 24-hour helpline on 116 123, email jo@samaritans.org, visit a Samaritans branch in person or go to the Samaritans website.