In early January 2025, Jonathan Bates, a 54-year-old schoolteacher and former soldier, was sentenced to five years' imprisonment after creating fake sexual images of several women and posting profiles of them on sexual services websites. All of the victims knew or worked with Bates, and it was the victims who collectively solved the long-running case when they realised he was the common denominator between them.
The ability to create false images, and the use of AI more broadly, have already become tools of the trade for fixated individuals like Jonathan Bates. [source]
Organised crime, however, poses an even greater and growing threat: fraud targeting commercial organisations using AI-generated deepfake audio and video is becoming increasingly common.
To safeguard yourself or your business from falling victim to deepfake fraud, here are some key strategies to apply:
· Two-Factor Authentication: Use two-factor authentication (2FA) on all your online accounts to add an extra layer of security.
· Voice or Video Verification: For critical communications, request a live video call to verify identity rather than relying solely on pre-recorded messages, which can be deepfaked.
· Verify Sources: Always double-check the source of any unusual communication you receive, especially if it involves personal or financial information. Rather than responding to unsolicited communications, contact the individual or organisation using verified contact details.
· Social Engineering Tactics: Learn the social engineering tactics attackers use to manipulate people into revealing confidential information or taking actions that compromise security.
· Awareness: Stay informed about how deepfakes work and how convincing they have become. Awareness is your first line of defence.
· Above all, be suspicious of unusual requests and unsolicited communications, especially those involving financial transactions or sensitive information, and verify any such request through a secondary, trusted channel. One of our clients came close to transferring a large payment after receiving a fake voice message and a personal email purporting to come from the company's CEO; the transfer was stopped only because staff were alert to this kind of scam.
Contact CFL today for a free, confidential call to discuss how our dedicated, personalised service can support you.
Author: Andy Nightingale (F/2237)