Hackers can now inject AI deepfakes directly into iOS video calls using this tool – here’s how to stay safe

By Efosa Udinmwen

22 September 2025

Jailbroken iOS devices are prime targets for sophisticated deepfake video injections


Deepfake injection attacks bypass cameras and deceive video verification software directly
Face swaps and motion re-enactments transform stolen images into convincing deepfakes
Managed detection services can identify suspicious patterns before attacks succeed

Digital communication platforms are increasingly vulnerable to sophisticated attacks that exploit advanced artificial intelligence.

A report from iProov reveals a specialized tool capable of injecting AI-generated deepfakes directly into iOS video calls, raising concerns about the reliability of existing security measures.
The discovery reveals how quickly AI tools are being adapted for fraud and identity theft, while exposing gaps in current verification systems.


A sophisticated method for bypassing verification
The iOS video injection tool, suspected to have Chinese origins, targets jailbroken iOS 15 and newer devices.

Attackers connect a compromised iPhone to a remote server, bypass its physical camera, and inject synthetic video streams into active calls.
This approach enables fraudsters to impersonate legitimate users or construct entirely fabricated identities that can pass weak security checks.
Using techniques such as face swaps and motion re-enactments, the method transforms stolen images or static photos into lifelike video.

This shifts identity fraud from isolated incidents to industrial-scale operations.
The attack also undermines verification processes by exploiting operating system-level vulnerabilities rather than camera-based checks.
Fraudsters no longer need to fool the lens; they can deceive the software directly.

This makes traditional anti-spoofing systems, especially those lacking biometric safeguards, less effective.
“The discovery of this iOS tool marks a breakthrough in identity fraud and confirms the trend of industrialized attacks,” said Andrew Newell, Chief Scientific Officer at iProov.
“The tool’s suspected origin is especially concerning and proves that it is essential to use a liveness detection capability that can rapidly adapt.”
“To combat these advanced threats, organizations need multilayered cybersecurity controls informed by real-world threat intelligence, combined with science-based biometrics and a liveness detection capability that can rapidly adapt to ensure a user is the right person, a real person, authenticating in real time.”
How to stay safe

Confirm the right person by matching the presented identity to trusted official records or databases.
Verify a real person by using embedded imagery and metadata to detect malicious or synthetic media.
Ensure verification happens in real time, using passive challenge-response methods to prevent replay or delayed attacks (a minimal sketch follows this list).
Deploy managed detection services that combine advanced technologies with human expertise for active monitoring.
Respond swiftly to incidents using specialized skills to reverse-engineer attacks and strengthen future defenses.
Incorporate advanced biometric checks informed by active threat intelligence to improve fraud detection and prevention.
Install the best antivirus software to block malware that could enable device compromise or exploitation.
Maintain strong ransomware protection to safeguard sensitive data from secondary or supporting cyberattacks.
Stay informed on evolving AI tools to anticipate and adapt to emerging deepfake injection methods.
Prepare for scenarios where video verification alone cannot guarantee security against sophisticated identity fraud.
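
The real-time requirement above is essentially a freshness problem: if a verification response can be pre-rendered or replayed, an injected stream can satisfy it. As a purely illustrative sketch (not iProov's product or any vendor's actual API), the Python below shows a nonce-based challenge-response check in which the verifier issues a single-use challenge, the trusted capture side binds the captured media to that challenge with a keyed digest, and anything reused, delayed, or unbound is rejected before biometric matching runs. The shared key, the ten-second window, and every function name here are assumptions made for the example.

```python
import hashlib
import hmac
import secrets
import time

# Key shared with the trusted capture component (e.g. a verification SDK).
# How that key is provisioned and protected is assumed, not shown.
SHARED_KEY = secrets.token_bytes(32)
CHALLENGE_TTL_SECONDS = 10          # illustrative freshness window
_pending: dict[str, float] = {}     # nonce -> time issued (single-use registry)


def issue_challenge() -> str:
    """Hand out a single-use nonce that the capture must be bound to."""
    nonce = secrets.token_hex(16)
    _pending[nonce] = time.monotonic()
    return nonce


def bind_capture(nonce: str, media_bytes: bytes) -> bytes:
    """What the trusted capture side computes: a keyed digest tying the media to the nonce."""
    digest = hashlib.sha256(media_bytes).digest()
    return hmac.new(SHARED_KEY, nonce.encode() + digest, hashlib.sha256).digest()


def verify_response(nonce: str, media_bytes: bytes, tag: bytes) -> bool:
    """Reject replayed, delayed, or unbound responses before any face matching runs."""
    issued_at = _pending.pop(nonce, None)       # pop: each nonce is accepted at most once
    if issued_at is None:
        return False                            # unknown or already-consumed challenge
    if time.monotonic() - issued_at > CHALLENGE_TTL_SECONDS:
        return False                            # too slow: likely pre-recorded or re-injected media
    return hmac.compare_digest(tag, bind_capture(nonce, media_bytes))


# Example flow
nonce = issue_challenge()
media = b"...captured video frames..."          # placeholder for a real capture
tag = bind_capture(nonce, media)                # done on the trusted capture side
assert verify_response(nonce, media, tag)       # fresh, bound response passes
assert not verify_response(nonce, media, tag)   # replaying the same challenge fails
```

In a real deployment the binding would happen inside a hardened capture path and the nonce registry would live in shared storage, but the shape of the check (single-use, time-boxed, and bound to the specific session) is what makes pre-rendered or re-injected streams fail, however convincing they look.
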
You might also like

These are the best free antivirus software in 2025
Here are also the best malware removal tools available at the moment
Ransomware and the UK’s proposed ban on payments: measured legal response or risk amplifier?

Efosa Udinmwen

Freelance Journalist

Efosa has been writing about technology for over 7 years, initially driven by curiosity but now fueled by a strong passion for the field. He holds both a Master’s and a PhD in sciences, which provided him with a solid foundation in analytical thinking. Efosa developed a keen interest in technology policy, specifically exploring the intersection of privacy, security, and politics. His research delves into how technological advancements influence regulatory frameworks and societal norms, particularly concerning data protection and cybersecurity. Since joining TechRadar Pro, he has focused on B2B security products in addition to privacy and technology policy. Efosa can be contacted at this email: udinmwenefosa@gmail.com


