Copyright The Street

It is an epidemic with no apparent cure. Every year in America, two million workers, roughly the population of Houston, are victims of workplace violence, the third leading cause of fatal occupational injuries in the country, according to the Occupational Safety and Health Administration. Thirty percent of employees have witnessed violence in the past five years, up from 25% in the same period through 2024, while 15% have been directly targeted, up from 12%, according to a survey by online compliance training company Traliant.

The economic impact of workplace violence is staggering. Estimates range from $120 billion to more than $171 billion annually in the U.S., with some sources citing figures as high as $250 billion to $330 billion.

Emotional problems resulting from violent incidents include depression, fear, post-traumatic stress disorder, loss of sleep, decreased ability to function at work, and increased absenteeism.

Hospital employees in emergency departments, psychiatric units and long-term care settings are at the highest risk for workplace violence, with nurses and other staff who have direct patient contact particularly vulnerable.

AI is being employed to combat workplace violence

Hospital sees results from using AI to combat workplace violence

The American Hospital Association reported that workplace and community violence cost U.S. hospitals an estimated $18.3 billion in 2023.

“Just hearing the frontline staff stories is just heartbreaking because they’re passionate about helping people and then they themselves become a victim,” Steve Miff, chief executive of Parkland Center for Clinical Innovation, the research institute of Parkland Memorial Hospital in Dallas, told Modern Healthcare.

The facility, which has experienced around 400 violent incidents against clinicians every year, is using artificial intelligence to combat workplace violence.

A team from Parkland Memorial’s research institute developed a predictive AI tool inside the hospital’s electronic health records that generates a risk assessment score based on the likelihood of a patient becoming violent.

The hospital tested the AI tool from October 2022 to August 2023. For every 1,000 patient-clinician interactions, the tool correctly predicted 7.1 violent events and missed 2.3 such events. Overall, the tool generated 823 correct predictions and 167 false alerts about nonviolent patients (what those figures imply about the tool’s accuracy is sketched below).

Miff said the tool had boosted the comfort level among clinicians at the hospital. “We constantly hear how it feels like they have somebody watching their backs,” he said.
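The pilot figures above can be read as a rough, back-of-the-envelope measure of the tool’s accuracy. The short calculation below is illustrative only and rests on assumptions Parkland has not published in these terms: that the 823 “correct predictions” are true positives, the 167 false alerts are false positives, and the 7.1 caught versus 2.3 missed events per 1,000 interactions correspond to true positives and false negatives.

```python
# Back-of-the-envelope reading of the Parkland pilot figures reported above.
# Assumption (not confirmed by the hospital): "correct predictions" = true
# positives, "false alerts" = false positives, "missed events" = false negatives.

true_positives = 823    # correct predictions of violent encounters
false_positives = 167   # alerts raised for patients who did not become violent

caught_per_1000 = 7.1   # violent events flagged per 1,000 interactions
missed_per_1000 = 2.3   # violent events the tool did not flag

# Precision: of all alerts raised, how many involved a patient who became violent?
precision = true_positives / (true_positives + false_positives)

# Sensitivity (recall): of all violent events, how many did the tool catch?
sensitivity = caught_per_1000 / (caught_per_1000 + missed_per_1000)

print(f"Implied precision:   {precision:.0%}")    # ~83%
print(f"Implied sensitivity: {sensitivity:.0%}")  # ~76%
```

On that reading, roughly one alert in six would be a false alarm, and about one violent event in four would still go unflagged.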
AI can employ such methods as predictive analytics to forecast and mitigate risks, anonymous reporting systems such as AI-powered chatbots, and proactive threat detection through video analytics and sensor data.

“AI has the potential to prevent workplace violence by developing unique patient insights through accessing almost instantly a patient’s medical history, past institutional encounters, and possibly even their social media posts,” according to a study in the Journal of Patient Safety.

The report said the information can help staff prepare de-escalating dialogue and avoid hot-button topics. AI can also monitor patients in waiting areas for potential confrontational behavior.

AI robots can navigate facilities and monitor for confrontational and illegal behavior, then notify a human security agent to resolve the situation, the study said.

Systems like IntelliSee and Scylla use computer vision to analyze existing security camera footage in real time. The AI is trained to recognize patterns and behaviors that could indicate potential threats, such as brandished weapons, aggressive movements like raised fists or pushing, or unauthorized entry into restricted areas.

By automating the detection process, Scylla says, AI video surveillance systems can significantly reduce the time it takes to identify and respond to an active-shooter situation. “This means that law enforcement can arrive on the scene faster, potentially stopping the shooter before they have a chance to cause more harm,” the company said on its website. (A stripped-down sketch of that detection-and-alert loop appears at the end of this article.)

Last month, Teladoc Health (TDOC) unveiled its latest innovation, Clarity, a monitoring system intended to address the threat of workplace violence in health-care settings. The new capability uses video and audio cues to analyze facial expressions, sense threatening gestures, and detect aggressive language.

Experts concerned about privacy in use of AI against workplace violence

There are concerns about using AI to address workplace violence, including issues related to privacy, data breaches, and potentially biased and inaccurate predictions.

A report from the University of Huelva in Spain warned that “AI implementation presents ethical challenges related to data protection, privacy, bias risks, prediction reliability, and potential dehumanization.”

“Addressing these concerns is crucial to ensuring safe and equitable AI use, always under human supervision,” the study said. “Effective prevention requires a comprehensive approach that integrates technology with organizational and educational measures.”
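The vendors do not publish their detection code, but the pattern IntelliSee and Scylla describe, continuously scoring live camera frames and escalating to a human when a threat such as a brandished weapon crosses a confidence threshold, can be sketched in a few lines. Everything below is illustrative, not any vendor’s implementation: score_frame stands in for a proprietary computer-vision model, notify_security is a placeholder escalation hook, and the threshold and camera URL are made up.

```python
import time
import cv2  # OpenCV, used here only to pull frames from an existing camera feed

ALERT_THRESHOLD = 0.85  # illustrative confidence cutoff, not a vendor setting
THREAT_CLASSES = {"brandished_weapon", "physical_aggression", "restricted_entry"}

def score_frame(frame):
    """Placeholder for a vendor's computer-vision model.

    A real system would run an object-detection or action-recognition network
    here and return (label, confidence) pairs for anything it recognizes.
    """
    return []  # no detections in this stub

def notify_security(label, confidence, frame):
    """Placeholder escalation hook: page a human security agent to review the feed."""
    print(f"ALERT: {label} detected with confidence {confidence:.2f}")

def monitor(camera_url):
    cap = cv2.VideoCapture(camera_url)  # reuse the existing security camera stream
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        for label, confidence in score_frame(frame):
            # Only threat classes above the cutoff are escalated, so routine
            # activity never reaches a human reviewer.
            if label in THREAT_CLASSES and confidence >= ALERT_THRESHOLD:
                notify_security(label, confidence, frame)
        time.sleep(0.1)  # sample roughly 10 frames per second rather than every frame
    cap.release()

if __name__ == "__main__":
    monitor("rtsp://example.invalid/lobby-camera")  # hypothetical stream URL
```

The point of the design, as the vendors describe it, is not that the model replaces security staff but that it shortens the gap between a threat appearing on camera and a person being told to look at that feed.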