Lake County porn case highlights dark side of new technology

2025-11-03

Copyright Chicago Tribune

The case of Ricardo Russell, a former Deerfield High School janitor described in court documents as a dealer of child pornography, highlights the dark side of a new technology and a rising concern for the Lake County State’s Attorney’s Office: artificially generated sexual images of children.

Russell faces 25 charges, including 13 for child pornography and 12 for “obscene depiction of a purported child,” authorities said.

According to a letter to parents from Township High School District 113 Superintendent Chala Holland, Russell had been let go from his job as a third-shift janitor on July 16, 2024, and the district did not learn about the allegations against him until news of the charges recently broke.

In court documents listing Russell’s charges and describing the investigation that led to his arrest, authorities described him as a dealer of child pornography, comparing him to a drug dealer slinging “dime-bags on a street corner.” Russell is alleged to have been active in numerous group chats at once, receiving hundreds of images, videos and conversations, then passing material along to other users. That included images of children under 13 years of age which, upon closer inspection, investigators say were created with artificial intelligence.

Court documents name a relatively obscure application that reportedly can generate AI images, among other offerings. According to the documents, forensic analysis recovered evidence of child sexual abuse material within Russell’s account with the app, although the documents do not indicate whether its image-generation tools were used to create any of the pictures.

Venkatramanan Siva Subrahmanian, a professor of computer science at Northwestern University who focuses on the intersection of AI and security problems, said such use of AI is a growing concern as the technology becomes more ubiquitous.

The issue goes back years, Subrahmanian said, recalling apps that stirred controversy with their ability to take images of clothed people and create nude images of them. While not necessarily accurate, the content was “very realistic.” Today, in the era of realistic AI image generators, such tools are more available than ever.

On the federal level, he said, there is now the TAKE IT DOWN Act, passed earlier this year, which criminalizes what is commonly described as revenge porn, including deepfakes, highly convincing images of real people created using artificial intelligence. The act requires websites and social media services to take the material down at the request of the victim within 48 hours.

But what has been “more tricky,” Subrahmanian said, is putting restrictions on the companies that can create these deepfakes, and on sexual content depicting fully fictional people.

There are also questions about the line between art and pornography, Subrahmanian said, issues that require larger conversations. “How should one state draw the line versus another state?” he said. “How should the federal government draw the line, especially in cases where the person in the imagery being portrayed doesn’t exist?”

But while states may have “different perspectives” on what type of sexual content is acceptable, Subrahmanian said, “everybody” would likely agree that such a depiction of a child must be forbidden.

The roots of the issue can be seen as far back as 2002, when the U.S. Supreme Court in Ashcroft v. Free Speech Coalition struck down provisions of the Child Pornography Prevention Act of 1996 for being overbroad.
In the dissenting opinions, justices expressed concern about how rapidly advancing technology could be used to create images indistinguishable from real child pornography.

Illinois, for its part, passed a law last year clarifying that the state’s child pornography statutes apply to AI-generated child sexual abuse material (CSAM). The legislation bars the use of AI to create child pornography, making possession of artificially created images a felony.

Sara Avalos, of the Lake County State’s Attorney’s Office, said there is “growing concern” surrounding the use of artificial intelligence-generated imagery depicting the sexual abuse of children.

“Even when images or videos are computer-generated or considered to be ‘not real,’ they still contribute to the exploitation of children by creating a demand for the production of abusive material,” Avalos said.

Finding the line

Some of the tools being used, including deepfake image- and video-generation algorithms, are put out by respectable companies, Subrahmanian said. Although these companies “make an investment and effort” to prevent misuse of the tools, hundreds of millions of images and videos are being posted online. “This means mistakes will happen,” he said.

There has also been recent talk from OpenAI, one of the more prominent artificial intelligence companies, that it will begin allowing adult content, such as “erotica,” for adult users, according to CEO Sam Altman. It’s something about which Subrahmanian is especially wary. “The proliferation of tools and services to do this is not a good thing,” he said.

But the use of AI for CSAM is the dark side of a technology with many legitimate applications. Subrahmanian offered more mundane, legal uses, such as taking an image of a house and asking the AI to paint it in the style of Monet, or creating 3D flight simulation environments. He can even imagine scenarios where nude forms would be generated in non-sexual contexts, such as the medical field.

“These kinds of applications should be supported and should not be banned,” he said. “But at the same time, the challenge is to figure out how a nude image is something that is crossing the line of the law, or what the population thinks is immoral.”
