Artificial Intelligence (AI) tools such as face swaps are now being used in Mission: Impossible-style cyber-enabled financial crimes. The South China Morning Post reports that last month criminals defrauded a multinational firm in Hong Kong of HK$200 million (US$26 million) using deepfake video technology.
The cybercriminal gang initially sent a message, purporting to be from the organization's chief financial officer (CFO), to an employee in the finance department of the unnamed company, inviting him to a video conference. On the call, the employee was joined by participants who looked and sounded sufficiently like his CFO and other colleagues to convince him to make a fraudulent transfer of company funds.
“I believe the fraudster downloaded videos in advance and then used deepfake technology to add fake voices to use in the video conference,” said Hong Kong police senior superintendent, Baron Chan Shun-ching. “We want to alert the public to these new deception tactics. In the past, we would assume these scams would only involve two people in one-on-one situations, but we can see from this case that fraudsters are able to use AI technology in online meetings, so people must be vigilant even in meetings with lots of participants.”
The Hong Kong police warning is timely, as cybercriminals are enthusiastically adopting deepfake video technology as the latest weapon in their ever-growing cyber-arsenals. Chan also said that police are currently investigating half a dozen similar cases in which deepfake video was employed.
According to a new report from online identification company iProov: “Innovative bad actors are using advanced AI tools, such as convincing face swaps in tandem with emulators and other metadata manipulation methodologies (traditional cyber-attack tools)… Face swaps are now firmly established as the deepfake of choice among persistent threat actors.”
Sevenfold rise in face-swap attacks in 2023
The iProov Threat Intelligence Report 2024: The Impact of Generative AI on Remote Identity Verification records a staggering 704 percent increase in face-swap injection attacks during the second half of 2023. iProov also reports that 47 percent of the deepfake gangs it tracks were formed during 2023, a year in which indiscriminate attacks ranged from 50,000 to 100,000 per month. The face-swap tools most commonly used by cybercriminal groups are currently SwapFace, followed by DeepFaceLive and Swapstream.
The report also details the modus operandi of some of the leading deepfake threat actors. Voltaire, for example, is described as "a highly skilled individual" who employs cost-effective and reproducible attack methods. Voltaire and their accomplices have been known to capture a person's facial image and digitally overlay it onto a body standing in front of a patterned backdrop. The resulting image is then displayed on a separate large-screen device, using advanced techniques to simulate life-like movement, and finally recorded by a camera. Voltaire's primary motivation is financial gain, frequently targeting banks in Latin America.
Staff across all organizations urgently need to be educated not to assume that a video call is bona fide simply because it looks and sounds real, and to apply the same checks they would to an unexpected incoming email.