Deepfake, a very dangerous attack tool

Author
Reinhold Zurfluh
Published
11 November 2024

Deception attempts via deepfakes can pose an enormous threat to companies, as recent deepfake scams have clearly demonstrated. Sophisticated attacks of this kind not only highlight how exposed high-level individuals are, but also underline the need for security training. Find out how you can protect yourself from deepfakes in this blog post.

Artificial intelligence can be used to create deceptively realistic forgeries. This is fuelling fears that so-called deepfakes could pose a serious threat to digital society. Developments of this kind mark a turning point in cyber security and force us to rethink our traditional approaches to it.

Experts predict that by 2026, one in three companies will realise just how vulnerable their established identity verification methods have become in the face of increasingly sophisticated deepfake technologies. As technology blurs the line between reality and fiction, the recent incident involving US Senator Ben Cardin clearly demonstrates the growing threat of deepfakes. What happened?

Deepfake is a reality

At the beginning of October 2024, Cardin was the target of a sophisticated deception campaign. The fraudsters, posing as Ukrainian Foreign Minister Dmytro Kuleba, managed to conduct a Zoom call with the senator. The imitation was so convincing that Cardin and his colleagues were initially fooled, which demonstrates the frightening accuracy of modern deepfake technology. The fraud was only exposed when the impersonator began to ask political questions that were out of character for Minister Kuleba.

The deviation from expected behaviour alarmed Senator Cardin and his team, whereupon they ended the call and alerted the authorities. This incident is not an isolated case. Deepfake scams are not only becoming more frequent, they are also becoming ever more sophisticated. A second case, which did not end so well for the victim, shows this too:

Deepfake and Phishing: A dangerous combination

This hacking story took place not long ago at the Asian branch of a multinational company. The attackers first contacted the victim with a phishing email as the primary attack vector and then followed up with a fake video call.

In this case, the attackers had gathered enough information to fabricate a convincing impression of authority for a finance employee, who transferred the equivalent of around 20 million Swiss francs in 15 transactions to five different bank accounts in Hong Kong before the fraud was discovered. Police reported that the original phishing email had lured the employee into an online session about urgent financial transactions. The fraudsters had carefully orchestrated a conversation between several senior executives of the company, including the CFO.

However, the only real person who dialled into this session was the finance employee, who then transferred the money. Everyone else, including the CFO, was a deepfake. The entire conversation was pre-recorded and there was no natural interaction. Nevertheless, the finance employee believed he had received the transfer order from management. The urgent need for the transactions was emphasised during the meeting, which led him to conclude that they genuinely had to be carried out. A sense of urgency, a sense of obligation, a strong motivation to act, and seemingly only one way out of the situation: a classic recipe for social engineering and phishing.

Deepfake and Phishing: Stay alert

No further details about the incident are known, but the fraud seems to follow the usual pattern: a phishing email poses as a company director and asks a finance employee to attend an urgent meeting to discuss important financial transactions. Our phishing poster explains how the employee could have recognised the phishing email (feel free to distribute it to your employees to make them aware of this danger).

Phishing poster

Deepfake and Phishing: Not with me

Nobody should sit back and say: “That wouldn’t have happened in our company. Our corporate culture is such that we always review a major transaction with management via an additional channel.” That may well be true. However, never assume that skilful fraudsters cannot obtain the data they need. Our simulated phishing attacks unfortunately confirm this all too often. And remember: cybercriminals know the tricks of their trade ...

Deepfakes are a powerful tool, so it should come as no surprise that the entire cybersecurity community braced itself for news of the first successful use of a deepfake for social engineering; it was only a matter of time. By now, a fake person in a deepfake video or audio recording can hardly be distinguished from a real one.

Despite the increasing sophistication of deepfakes, there are characteristic features that make it possible to identify such forgeries. Attentive observers can look out for subtle signs such as unnatural eye movements or missing light reflections in the eyes. Anomalies around the chin, the eyebrows or the edges of the face can also point to a deepfake, and inconsistencies in the background or lighting can provide important clues to possible manipulation. But beware: the quality of deepfakes is improving day by day ...
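One of these cues, unnatural blinking, can even be checked roughly with freely available tools: early deepfake models were known to reproduce blinking poorly. The following is a minimal sketch, not a production detector. It uses OpenCV’s bundled Haar cascades as a crude eyes-open/eyes-closed proxy; the file name “meeting.mp4” and the warning threshold are purely illustrative assumptions.

import cv2

# Sketch: flag an unnaturally low blink rate in a recorded video call.
# Haar cascades are a crude proxy; real deepfake detection needs far more robust models.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture("meeting.mp4")          # hypothetical recording of the call
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0        # fall back to 25 fps if unknown

frames_with_face = 0
frames_eyes_closed = 0                          # frames with a face but no detectable eyes

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        continue
    x, y, w, h = faces[0]
    frames_with_face += 1
    roi = gray[y:y + h // 2, x:x + w]           # eyes sit in the upper half of the face
    eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        frames_eyes_closed += 1                 # crude proxy for "eyes closed"

cap.release()

if frames_with_face:
    closed_ratio = frames_eyes_closed / frames_with_face
    minutes = frames_with_face / fps / 60
    print(f"Analysed {minutes:.1f} min of face footage, eyes-closed ratio: {closed_ratio:.3f}")
    # Humans blink roughly 15-20 times per minute; an almost-zero ratio over several
    # minutes is a weak warning sign worth a closer look, not proof of a deepfake.
    if closed_ratio < 0.01:
        print("Warning: almost no blinking detected - inspect the video more closely.")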

Deepfake and Phishing: How to protect yourself

There’s still another truth: this fraud could have been avoided by giving employees appropriate training! Always pause and reflect. Trust, but verify, is the motto. Be sure to use the chain of command to check transactions, especially if requests seem unlikely or unusual.

When dealing with insidious threat tactics such as phishing and deepfakes, it’s important to show healthy scepticism and vigilance. Here are some basic behaviours:

  • Check unexpected messages, especially those concerning sensitive information or financial transactions.
  • Be suspicious of urgent requests or pressure attempts, which are often the hallmark of fraudsters.
  • Keep yourself informed about the latest deepfake technologies and fraud techniques.
  • Use reliable verification and filtering measures, such as out-of-band checks (see the sketch after this list).
  • Trust your instincts: if something feels strange, it probably is.
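To illustrate what such a verification measure can look like in practice, here is a minimal sketch of a rule that blocks large or risky payment requests until they have been confirmed over an independent channel, for example a call-back on a number from the company directory. The PaymentRequest type, the threshold and the field names are hypothetical illustrations, not an existing API; the point is simply that a request received by email or in a video call is never executed on that channel alone.

from dataclasses import dataclass

OUT_OF_BAND_THRESHOLD_CHF = 10_000   # assumed policy threshold, adjust to your own rules

@dataclass
class PaymentRequest:
    requester: str                        # who appears to be asking, e.g. "CFO"
    amount_chf: float
    channel: str                          # "email", "video_call", ...
    confirmed_out_of_band: bool = False   # set only after a call-back on a known number

def may_execute(req: PaymentRequest) -> bool:
    """Allow execution only if the request was confirmed via an independent channel."""
    if req.amount_chf < OUT_OF_BAND_THRESHOLD_CHF and req.channel not in ("email", "video_call"):
        return True
    return req.confirmed_out_of_band

# Usage: a large transfer requested in a video call stays blocked until the employee
# has called the requester back on the number stored in the company directory.
req = PaymentRequest(requester="CFO", amount_chf=1_500_000, channel="video_call")
assert may_execute(req) is False
req.confirmed_out_of_band = True          # the call-back on a known number succeeded
assert may_execute(req) is True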

We’re facing a future in which it will be almost impossible to distinguish a counterfeit from the original.

Companies, your organisation and you as a private individual will all suffer if trust in images, audio and text is fundamentally lost. You need to be prepared for incidents like those described here.

Targeted awareness training against Deepfake and Phishing

Raising awareness among your staff about how to handle information and about current threats is essential for protecting against deepfakes and phishing attacks, and it strengthens your cyber resilience in the long term. By addressing your company’s security culture, you can encourage prudent cyber security behaviour in a targeted way. A sustainable security culture is the best protection for your company. Correct behaviour and a quick response can ward off deepfake and phishing attacks. The added value of security awareness training is beyond question.

When properly communicated and established, security awareness can make the difference between a fatal click and the right course of action. But remember: security awareness doesn’t happen overnight and is usually more complex than expected. It’s therefore all the more important to get support from experts. Our experienced awareness team has been helping companies establish a lived security culture for many years. It’s also important to give employees the opportunity to consolidate and verify the behaviour they’ve learned in regular phishing simulations, for instance with the world’s largest “Security Awareness Training” platform from KnowBe4.

Are you keen to discover KnowBe4’s security awareness training platform? Contact us for a no-obligation live demo. We look forward to boosting the security awareness of you and your employees!
Request demo


Source: KnowBe4 blog article

Caption: AI-generated image
