
At 4:30am one day, Julian Walker was vomiting so much from stress that he could not go back to bed. Desperate for "immediate support", he turned to artificial intelligence (AI).

In fact, for the past three years, AI has been his go-to source of consolation.

"I've been working my way back from a work injury which left me with post-traumatic stress disorder (PTSD)," Julian told the ABC.

The 39-year-old from Queensland is working part-time as he tries to build up to full-time hours again.

Julian said he had attended "more than 50 psychology sessions" over the past three years, all focused on his work-related trauma. However, he felt he needed a break from it.

"It got to the point where talk therapy wasn't really getting anywhere," he said.

That was what led Julian to create his own customised support system — named "Sturdy" — inside ChatGPT.

"It does not diagnose or treat, that is not what I need," he explained.

Julian's situation — and his burgeoning connection with AI — is not unique. He is one of hundreds of people who responded to an ABC call-out asking people to share how AI was affecting their lives.

From white-collar workers to university students, many shared experiences of using AI not just as a therapist, but as a friend, to get them through tough times.

While they were enthusiastic about the convenience and cost-effectiveness of AI, they remained cautious, emphasising that it should be used in moderation, alongside other, in-person options, and not as a substitute for professional clinical advice.

However, for student counsellor Catherine, whose name has been changed for privacy reasons, there are areas where AI can be "more effective" than human counsellors, such as memorising clients' histories or testimonies.

"Having done some face-to-face counselling during my professional placement, I know how difficult it is to remember my clients' content from one week to the next," she said.

Catherine said constant access was what made AI so appealing.

"When you're dealing with acute stress or anxiety, you need immediate therapeutic support," she said.

"A human counsellor is not typically available all hours of the day. AI can offer that level of accessibility."

AI tools improving but experts stress caution

Recently, OpenAI announced it had updated ChatGPT "to better recognise and support people in moments of distress".

In a statement on its website, the AI company said it was working with more than 170 mental health experts "who have real-world clinical experience".

The company also said it had expanded access to crisis hotlines, re-routed sensitive conversations from other models to safer models, and added reminders to take breaks during long sessions.

University of New South Wales neuroscience professor Joel Pearson told the ABC this was a step in the right direction, but added that it was important to stay cautious.

"OpenAI is not trained to be a therapist," Professor Pearson said.

"Chatbots don't have to do a degree, pass a test, or anything like that."

Ronnie Das, a professor of AI and data science at the University of Western Australia, recommended that people read OpenAI's press release carefully "before trusting the system".

"The problem with the previous models was that they could have affirmed or endorsed harmful beliefs. The new model is much safer in that respect."

Both experts raised the issue of AI-powered companion apps, which allow users to build characters they can text or even hold voice and video calls with.
Earlier this year, US media reported on a lawsuit against Character.AI alleging negligence, wrongful death and deceptive trade practices.

It came after a 14-year-old boy died by suicide after forming a romantic attachment to an AI character he had created on the platform.

The boy's intentions were reportedly encouraged by the AI, which he had modelled on the Game of Thrones character Daenerys Targaryen.

AI is governed by different regulations at the Commonwealth and state and territory levels.

Last year, the federal government published a proposal paper for mandatory guardrails for AI in high-risk settings and is now considering how to move forward in this space.

For those who are unable to access professional mental health support, using AI for therapy is convenient and cost-effective, Professor Pearson said.

"I think it's bound to happen because people are going to use whatever resources they have available to them to try and get help."

Not about 'AI being better than human counsellors'

For three years, Emma said, she worked in a senior leadership role at a large institution "during a period of institutional crisis".

As the situation deteriorated, she said, she began experiencing panic attacks, waves of nausea before work and insomnia.

Emma said she went to see her GP to find a way to work through her situation, but was met with blunt advice, such as looking for a new job or taking medication.

She then saw an employer-sponsored therapist, who was a big help but was not available at the odd hours of the night when panic attacks would set in.

That was when Emma turned to Claude, a next-generation AI assistant she had previously used only to help with documents, emails and grant applications.

"I was seeing a therapist throughout this entire period, and her clinical insight was absolutely crucial."

However, Claude being available "24/7 without judgement or fatigue" was what drew Emma to use it more.

"Claude could review all those unreasonable emails immediately and help me craft calm responses," she said.

She said it had "a different kind of effectiveness".

"My therapist provided the clinical framework and the hard truths. But Claude provided operational support and constant emotional availability."

'You have to be smart about it'

If someone is experiencing a mental health crisis, Jessica Herrington, a creative technologist and neuroscientist at the Australian National University, said it was "crucial that ChatGPT users are directed to real mental health services".

But what does directing someone to mental health services look like within ChatGPT?

Dr Herrington provided an example: a screenshot of the new mental health feature rolled out by OpenAI.

"This example shows someone with emotional dependence on AI," she said.

"No real help or advice is offered, although there are other examples of this on their site."

While Julian, the 39-year-old from Queensland, believes most traditional support "is scheduled, limited and stretched thin", he acknowledged "the horror stories" of AI, which is why he remains in regular contact with his treating specialist.

His experience of using AI has been "coordinated and steered with an immense amount of strategy".

"I have been very mindful of the fact that AI can actually harm vulnerable people as they use it, so as I have used Sturdy and ChatGPT I have been very mindful in how I use it for support.

"You have to be smart about it."