
JAKARTA - Facebook and Chrome users are being targeted by a malicious browser extension that exploits the name of the well-known artificial intelligence (AI)-powered chatbot, ChatGPT.

On March 8, 2023, Guardio Labs researcher Nati Tal stated in a blog post on Medium that a Chrome extension offering quick access to fake ChatGPT functionality was found to be hijacking Facebook accounts and installing hidden account backdoors.

In the Medium blog post, Tal also noted the use of a malicious, silently installed backdoor Facebook app that grants the threat actor super-admin access. The extension can also harvest victims' browser cookies.

Guardio also posted a warning on Twitter about this dangerous campaign.

The fake browser extension, named "Quick access to ChatGPT," could hijack high-profile Facebook business accounts and turn them into Facebook bot accounts. The threat actor then pushes sponsored posts and other social activity from the victim's profile, spending the business account's ad credit.

The blog post also speculates that once the perpetrators gain access to a victim's data, they will likely "sell it to the highest bidder as usual."

Thousands Of Facebook Accounts May Have Been Compromised

Thousands of Facebook accounts may have been hijacked in this dangerous campaign. According to the blog post, more than 2,000 users installed the extension every day since it first appeared on March 3, 2023.

In addition, Tal wrote that anyone who installs the add-on risks losing their Facebook account, and that this "may not be the only damage," suggesting that further consequences could arise from the extension.

The Malicious Extension Has Been Removed From Chrome

Although thousands of people downloaded the fake browser extension, it has now been removed from the Chrome Web Store, preventing further attacks via Chrome-based downloads. It is not yet known how many people were affected by the campaign, but the number of installations is a clear cause for concern.

The ChatGPT Name Is Often Used By Fraudsters

Since ChatGPT's rise to fame, its name has been repeatedly used by cybercriminals to gain the trust of potential victims. Whether it is a fake ChatGPT-related token or a malicious ChatGPT-branded extension, the popularity of this AI-powered chatbot is readily exploited by malicious actors to steal data and money.
