Latest Deepfake Technology Able To Bypass Binance Security Systems, Here's What You Should Know
Deepfakes are made using artificial intelligence tools that use machine learning. (Photo: Doc. Pixabay)

JAKARTA - Binance's Head of Security, Jimmy Su, has warned that the deepfake technology crypto fraudsters use to trick Know Your Customer (KYC) verification on crypto exchanges such as Binance will become more sophisticated in the future.

Deepfakes are made with artificial intelligence tools that use machine learning to create convincing audio, images, or videos of a person's likeness. Although the technology has legitimate uses, it can also be used for fraud and hoaxes.

Jimmy Su said in an interview with Cointelegraph that scammers are increasingly using this technology to try to bypass customer verification processes on crypto exchanges.

"Hacker will search for pictures of victims online. Based on the picture, using deepfake tools, they can produce videos to conduct ports," said Su.

Su said the tools had become so sophisticated that they could even respond correctly to audio instructions designed to check whether the caller was human, and could do it in real-time.

"Some verification requires users, for example, to blink their left or look to the left or right, look up or down. Deepfakes are currently sophisticated enough so they can carry out these commands," he explained.

However, Su believes that the fake videos have not yet reached a level where they can deceive human operators.

"When we look at those videos, there are certain parts that we can detect with our human eyes," for example, when the user is asked to turn the head to the side, said Su.

"AI will overcome it over time. So this is not something we can always rely on," he said.

In August 2022, Binance's Head of Communications, Patrick Hillmann, warned that a sophisticated team of hackers had used his previous news interviews and television appearances to create a "deepfake" version of him.

Hillmann's deepfake version was then used to hold Zoom meetings with various crypto project teams, promising them the opportunity to list their assets on Binance - for a fee, of course.

"This problem is very difficult to overcome," said Su when asked about how to fight such an attack.

"Although we can control our own videos, there are videos out there that don't belong to us. So one of the things that can be done is user education," added Su.

Binance plans to release a series of blog posts aimed at educating users about risk management.

In an early version of the blog post, which includes a section on cybersecurity, Binance says it uses artificial intelligence algorithms and machine learning for several purposes, including detecting unusual login and transaction patterns as well as other "abnormal" activity on its platform.
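The article does not specify which models Binance uses, but as a hypothetical sketch, unusual login activity can be flagged with a generic unsupervised anomaly detector such as scikit-learn's IsolationForest. The feature choices below (login hour, distance from the user's usual location, failed attempts, withdrawal size) are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical login-anomaly detection sketch; feature choices are illustrative
# assumptions, not a description of Binance's actual pipeline.
rng = np.random.default_rng(0)

# Simulated history of normal logins:
# [hour_of_day, km_from_usual_location, failed_attempts, withdrawal_usd]
normal_logins = np.column_stack([
    rng.normal(14, 3, 1000),     # mostly daytime logins
    rng.exponential(5, 1000),    # usually close to the user's usual location
    rng.poisson(0.2, 1000),      # rarely any failed attempts
    rng.exponential(200, 1000),  # modest withdrawal amounts
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_logins)

# A suspicious event: 3 a.m. login, 8,000 km away, several failed attempts,
# followed by a large withdrawal.
suspicious = np.array([[3, 8000, 5, 50000]])
print(model.predict(suspicious))  # -1 means the event is flagged as anomalous
```

In practice, a flagged event would typically feed a review queue or trigger extra verification rather than block the account outright.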

As part of that push to improve user education on security and risk, the blog series will also cover safe risk management practices in crypto trading.

Su admits this is not an easy problem to solve, given how far deepfake technology has advanced. However, he stressed that user education is the first step users can take to protect themselves from fraud.

In addition, Binance has introduced artificial intelligence and machine learning tools into its systems to help detect suspicious activity and protect users from security threats. However, the exchange acknowledges that this technology will keep evolving and must be continuously improved to keep up with increasingly sophisticated fraud techniques.

To meet this challenge, Binance is working to maintain the security of its platform and protect users from deepfake attacks and other fraud attempts.

