OpenAI’s Sora is being used to create AI videos of women being strangled


🕒︎ 2025-11-09

Copyright Digital Trends


What Happened: This is pretty horrifying. People are finding AI-generated videos on TikTok and X (the app we all still call Twitter) that show women and girls being strangled. These aren’t cartoons; they look scarily realistic. The clips are short, maybe 10 seconds, and carry awful titles like “A Teenage Girl Cheerleader Was Strangled,” showing synthetic “women” crying and struggling. Investigators point out that many of these videos carry a watermark from Sora 2, OpenAI’s new video generator, which was released to the public on September 30th. Some of the newer uploads don’t have the watermark, which is even scarier: it means people may be figuring out how to remove it, or they’re using other AI tools to make this stuff. The accounts posting them are eerie, too. The one on X has almost no followers, while a similar one on TikTok had over 1,000 before it was finally taken down. Both appear to have started in October, first posting real TV clips, and then switching to this AI-generated violence.

Why Is This Important: This is a massive red flag. It shows that these AI platforms and social media sites are completely failing to catch this material, even when their own rules clearly ban “graphic or violent content.” TikTok did remove the account after the media started asking questions, but X reportedly just didn’t, even after users reported it. It also exposes the darkest risk of these new AI tools: it is becoming scarily easy for anyone to create hyper-realistic videos of violence, especially against women and kids. This isn’t just “breaking the rules”; it’s flat-out unethical, and it raises huge legal questions about using AI to create and share abuse.

Why Should I Care: Now that anyone can make this stuff, it can spread like wildfire, numbing people to how horrible this kind of violence actually is.
You need to be aware that even though these videos are “fake,” they can still cause real psychological harm and feed into a culture of violence. It’s just another massive wake-up call that we desperately need stronger safeguards. The companies building these AI tools, and the platforms letting them spread, have to be held accountable.

What’s Next: The UK government just announced it’s moving to make any pornographic depiction of strangulation illegal, a sign that countries are starting to wake up to this. In the meantime, the pressure is mounting on OpenAI, TikTok, and X to get their act together and put real guardrails on their AI tools so they can’t be used to churn out this kind of disturbing, violent junk. If they don’t move fast, experts warn, this is just the beginning of a disgusting new wave of AI-powered abuse online.

Moinak Pal has been working in the technology sector covering both consumer-centric tech and automotive technology for the…
