Brits urged to create safe words as AI deepfake scams continue to rise

Eilidh Farquhar, Niamh Kirk | 2025-10-21

Families across the UK are being urged to create safe words and passwords as deepfake video scams continue to surge online alongside advances in artificial intelligence. Worryingly, it is not just celebrities being targeted; scammers are also finding ways to attack everyday people and their families.

As AI becomes more sophisticated, it is increasingly difficult to tell real images and videos from generated ones, and the rise of these 'deepfake' scams has prompted growing concern for public safety. Frequent TikTok users have likely come across an AI-generated video of a notable figure or celebrity that took a while to identify as fake. Videos currently doing the rounds include likenesses of Jake Paul, Tupac and Robin Williams.

While some might see this as a bit of a laugh, many people are now urging families and friends to agree on 'safe words' and passwords only they would know, to provide extra protection against impersonation scams, reports the Mirror. Given how easily savvy hackers can steal personal data, some crooks are now using that information to clone people's faces and voices and target victims through fake video calls demanding money.

These worrying scams are easy to fall for. One TikTok user, @chelseasexplainsitall, showed followers how simple it can be to make your own fake videos through Sora, a text-to-video app from OpenAI that can create realistic videos and currently has over one million downloads. Able to generate a video in less than a minute, the app can place people in a range of situations they have never been in.

After using the app, Chelsea offered families some vital advice to protect themselves from scams. She said: "You and your family need a safe word." While deepfake videos can mimic your face, voice and mannerisms, they cannot know secret words or phrases agreed in advance. She added: "These deepfakes are crazy, the scams are going to be insane. It's so realistic, it's actually frightening."

Another problem with deepfake videos is the depiction of deceased celebrities and public figures. Experts are concerned that using celebrities who have passed away could spread misinformation; these people also cannot consent to, or opt out of, AI videos.

A spokesperson for OpenAI told NBC News: "While there are strong free speech interests in depicting historical figures, we believe that public figures and their families should ultimately have control over how their likeness is used.

"For public figures who are recently deceased, authorised representatives or owners of their estate can request that their likeness not be used in Sora cameos."

Chelsea's video currently has over 41,000 likes and more than 700 comments, with users sharing their concerns over the rise of AI scams. One person wrote: "Eventually, we will not be able to differentiate what’s real and what’s not real." Another said: "Guys.. When I tell you. As someone working in banking. Protect yourself and your loved ones." A third added: "Thank you for this PSA!! We need AI regulation, I’m so concerned for the future. We need more data literacy education to be able to distinguish from real and AI-generated content. Content should be flagged with a label so people know upfront."
