It’s easy to see why tech companies are rushing to find new ways to use artificial intelligence. You know, because actual intelligence is on the wane these days.

We live in a world where people believe rising prices are really falling, politicians say one thing and do the exact opposite — with a straight face — and no amount of pointing this out registers with their followers. And people with infectious diseases think it’s perfectly fine to go out in public and spread them. OK, maybe that’s not a lack of intelligence, but of empathy. Or common sense.

So, now AI comes along, and as many folks will recall, “Terminator 2” warned us this would not work out well for humans. Especially when the machines get some authority.

Now, why would anyone think we might squander advanced technology or allow it to spiral hopelessly out of control? Uh, possibly because we carry more computing power in our pockets these days than NASA used to send men to the moon, and we mostly use it to fight with strangers on the internet and watch AI-generated videos of raccoons jumping on trampolines.

While everyone is distracted by such trifles, many companies are employing AI to replace employees, often in online customer service. Tried getting a prescription refilled at a pharmacy lately? It’s like talking to HAL from “2001: A Space Odyssey.” Dave, you won’t last long without your blood thinners. Unfortunately, we’re out of stock … too bad for you.

Other institutions are finding ways for AI to supplement the work of people. The Charleston County School Board on Monday discussed various ways AI could be used by teachers in the classroom.

And, to the point, Isle of Palms police are using AI to help write incident reports based on body-worn camera footage. As The Post and Courier’s Anna Sharpe reports, IOP PD is using a system that transcribes and analyzes scenes from body-cam footage, and it’s a major time-saver for the small police force.

Which is one generally accepted, and benign, purpose of AI: saving time. So far, so good. Writing reports eats up a lot of officers’ time — time better spent rounding up bad guys.

But the American Civil Liberties Union is questioning the transparency and accuracy of such reports, because, well, AI has its limitations. Ever seen an AI-generated deepfake video of someone that looks real … other than the fact that he has seven fingers on one hand?

Point is, even if AI is way more accurate than flawed human beings, it sometimes fails spectacularly. There’s the story about the car dealership chatbot that was talked into selling a car for $1 (the buyer lost in court) or, more famously, New York’s AI chatbot telling food and beverage managers it was fine to take employees’ tips and that landlords could discriminate against tenants based on their profession. It was, uh, shut down.

When it comes to AI having a hand in the justice system, we should proceed with caution. With AI, it’s garbage in, garbage out — which means any program is only as good as the information it has. And it gets some of that information from the internet. Yeah. Dave, it looks like your suspect has a gun …

Solicitor Scarlett Wilson has studied this a bit, and she says it’s a good idea for AI to generate initial incident reports, what with the busy schedules of officers. That is, so long as officers edit those reports closely and add their own observations … which often aren’t picked up on camera.

“The two dangers are incompleteness and inaccuracy,” Wilson says. “It is critical that the original source of the information is retained and that officers double check the AI summary. As long as we — and the defense — have the original source, we have the best evidence. An over-reliance on AI could lead to incomplete reports because not everything officers experience is captured on video.”

In other words, defense attorneys will scrutinize such reports and use any discrepancies to impeach officers as witnesses. Over-reliance on AI summaries, Wilson says, could lead to trouble — like bad guys winning trials they shouldn’t.

But, other than that, what could go wrong?

Wilson’s analysis should put us at ease over Officer AI on the IOP. The machines aren’t taking over (yet), but they have found their way into the criminal justice system. If it helps, great. But if HAL’s buddies start causing problems, they should be relegated to the thriving online raccoons-on-trampolines video business.