How a ‘nudify’ site turned a group of friends into key figures in a fight against AI-generated porn
By Jonathan Vanian
In June of last year, Jessica Guistolise received a text message that would change her life.
While the technology consultant was dining with colleagues on a work trip in Oregon, her phone alerted her to a text from an acquaintance named Jenny, who said she had urgent information to share about her estranged husband, Ben.
After a nearly two-hour conversation with Jenny later that night, Guistolise recalled, she was dazed and in a state of panic. Jenny told her she’d found pictures on Ben’s computer of more than 80 women whose social media photos were used to create deepfake pornography — videos and photos of sexual activities made using artificial intelligence to merge real photos with pornographic images. All the women in Ben’s images lived in the Minneapolis area.
Jenny used her phone to snap pictures of images on Ben’s computer, Guistolise said. The screenshots, some of which were viewed by CNBC, revealed that Ben used a site called DeepSwap to create the deepfakes. DeepSwap falls into a category of “nudify” sites that have proliferated since the emergence of generative AI less than three years ago.
CNBC decided not to use Jenny’s surname in order to protect her privacy and withheld Ben’s surname due to his assertion of mental health struggles. They are now divorced.
Guistolise said that after talking to Jenny, she was desperate to cut her trip short and rush home.
In Minneapolis, the women’s experiences would soon spark growing opposition to AI deepfake tools and those who use them.
One of the manipulated photos Guistolise saw upon her return was generated using a photo from a family vacation. Another was from her goddaughter’s college graduation. Both had been taken from her Facebook page.
“The first time I saw the actual images, I think something inside me shifted, like fundamentally changed,” said Guistolise, 42.
CNBC interviewed more than two dozen people — including victims, their family members, attorneys, sexual-abuse experts, AI and cybersecurity researchers, trust and safety workers in the tech industry, and lawmakers — to learn how nudify websites and apps work and to understand their real-life impact on people.
“It’s not something that I would wish for on anybody,” Guistolise said.