Deepfake Used In UAE Theft That Netted US$35 Million

JAKARTA - Police in the United Arab Emirates are investigating a case in which AI was allegedly used to clone the voice of a company director and steal US$35 million (around Rp492 billion) in a massive theft.

While this is a stunning and unusual crime, it is not the first time fraudsters have used AI-based voice spoofing to carry out a daring heist.

An earlier example of such technology being used for similar scams occurred in 2019, when criminals in the UK were said to have used deepfake software to mimic the voice of the CEO of an energy company to fraudulently transfer about $243,000.

While artificial intelligence is expected to open up a wave of opportunities in the years to come, the threats posed by the technology are also very real.

Job loss driven by automation is often considered the most pressing issue with AI, but the technology also poses serious challenges in other areas, including privacy threats from the rampant use of facial recognition, and audio and video deepfakes created by manipulating a person's voice and likeness.

While the former tends to gain attention in the mainstream media, the latter also poses a major threat, as this scam case shows.

As reported by Forbes, the most recent example of voice manipulation being used for fraud occurred in the UAE in early 2020, when criminals allegedly used AI to clone the voice of a company director and ask a bank manager to transfer US$35 million to fund an acquisition.

The bank manager duly made the transfer, believing that everything was legitimate, only to realize later that it was an elaborate scam orchestrated by high-tech criminals. As it turns out, the fraudsters used 'deep voice' technology to deceive the manager and defraud the bank of the huge sum.

UAE Asks US Authorities for Help

According to court documents, investigators in the UAE are now seeking help from US authorities to trace $400,000 of the stolen funds, which they believe are held in US bank accounts. The remaining funds are thought to be spread across accounts under various names at banks around the world. According to UAE authorities, at least seventeen people were involved in the scheme, although their names and nationalities were not immediately clear.

Speaking to Forbes, Jake Moore, an expert at cybersecurity firm ESET, said that audio and video 'deepfake' technology can be a real problem in the wrong hands, as deepfakes pose a "huge threat to data, money and business." Moore also added that an increasing number of businesses are likely to fall victim to similarly realistic deepfake audio scams in the future.