CyberNews Briefs

A Harbinger Of Our Future: Reports Indicate Voice Deepfake Was Used To Scam A CEO Out Of $243,000

Adversaries in search of financial gain will innovate. We all need to accept that observation as a fact of life, meaning we should stay agile and prepare for surprise. When it comes to deepfake video and audio, what is surprising is not that adversaries are using this technology, but that they used it so successfully so fast.

We are not sure the story below actually happened; in fact, odds are that this early report is not the full truth. But it is probably important to think through this now to mitigate future problems.

Here is more:

Threat actors recently used AI voice technology to masquerade as the CEO of a German enterprise in a phone call with the CEO of a UK subsidiary, according to a report by The Wall Street Journal. Because the AI solution generated a lifelike version of the German CEO’s voice with the correct accent and “melody,” the scammers managed to convince the UK executive to carry out an urgent transfer of €220,000 (around $243,000) to a bank account they controlled. The victim was told the account belonged to a Hungarian supplier.

When the attackers later contacted the same CEO to ask for an additional payment, the executive grew suspicious and the scheme eventually unraveled. The incident is the first known example of a successful deepfake voice scam. Deepfakes are audio or visual content doctored by artificial intelligence (AI).

Read more: A Voice Deepfake Was Used To Scam A CEO Out Of $243,000

OODA Analyst

OODA is comprised of a unique team of international experts capable of providing advanced intelligence and analysis, strategy and planning support, risk and threat management, training, decision support, crisis response, and security services to global corporations and governments.