Can you imagine receiving a video call from your CEO asking you to approve an urgent transfer?
He moves his head, gestures, speaks with his exact voice and seems totally real.
There’s only one problem: it’s not him.
This is already happening. Deepfakes – videos generated by artificial intelligence that imitate human faces and voices – are becoming one of the favourite tools of cybercriminals.
What are deepfakes?
The term comes from deep learning and fake.
Through neural networks and training with real images and audio, AI is able to create hyperrealistic videos where a person seems to say or do something that never happened.
Until recently, this was the stuff of movies. Today, a few minutes of public video or audio (such as social media posts or recorded video calls) are enough to generate imitations that are almost impossible to distinguish from the real thing.
Real cases that have already occurred
- Fake manager in Hong Kong (2024): an employee transferred more than 25 million dollars after receiving a video call from his “finance director”… who was actually a deepfake.
- Cloned voice of a CEO: cybercriminals used AI-generated audio to order an employee to send funds to a fraudulent account.
- Political and media deepfakes: manipulated videos of public figures circulate showing them saying things they never said, damaging reputations and influencing electoral decisions.
- Support scams: fake videos or avatars are used to convince victims to install software or hand over passwords.
- Fake candidates in interviews: criminals used deepfakes in video call interviews to pose as job applicants and gain access to sensitive corporate data.
- Impersonated banking supplier: a British executive transferred more than £500,000 after a video call with a “supplier” whose face and voice were fake.
- Cloned influencers: fake videos of well-known figures were sent out proposing commercial collaborations; all of it was AI-generated content.
- Impersonated relatives: victims received calls or videos from “children” or “grandchildren” with cloned voices urgently asking for money.
Why do they work so well?
Because they appeal to authority and urgency, the two psychological factors most exploited by scammers.
If the person you see and hear seems to be your boss or an important client, and also tells you that “it’s urgent”, it’s easy to lower your guard.
Deepfakes combine advanced technology with social engineering, and that makes them dangerously convincing.
How to protect yourself
- Always check through another channel: before acting on an unusual order, confirm it by phone call, message or face-to-face meeting.
- Establish clear internal protocols: define authorisation limits for payments or access to data, even if the orders seem to come “from above”.
- Train your team: the more your employees know about these techniques, the harder they will be to deceive.
- Use multi-factor authentication (MFA): even if attackers steal a visual or voice identity, they will not be able to get in without the second factor.
- Be wary of urgency: time pressure is fraud’s best friend.
We help you shield your team
At OptimalPyme we teach you how to recognise the signs of digital manipulation, establish secure internal protocols and train your team against deepfakes, phishing and other modern threats.
Don’t wait for them to call you with your boss’s (fake) face.
Train your team today and reduce the risk tomorrow.
— CEO of OptimalPyme