Following up on its removal of 63,000 Instagram accounts in Nigeria involved in sextortion scams, Meta announced it has developed a new signal to identify accounts potentially engaged in sexual extortion.

With this new signal, the tech giant says it can take more effective steps to help prevent these accounts from finding and interacting with teenagers.

In addition, Meta is starting to test an on-device nudity protection feature in Instagram DMs, which blurs images detected to contain nudity.

The feature is expected to encourage people to think twice before sending sensitive images and to direct them to safety tips and resources, including NCMEC's Take It Down platform.

Previously, Meta carried out a strategic network disruption of two sets of accounts in Nigeria affiliated with the Yahoo Boys that attempted to engage in financial sextortion scams.

The Yahoo Boys are a loosely organized group of cybercriminals, largely operating out of Nigeria, who commit various types of fraud, including extortion.

To crack down on this crime, Meta announced that it has removed about 63,000 Instagram accounts in Nigeria that attempted to directly engage in financial sextortion scams.

This includes a smaller coordinated network of about 2,500 accounts that could be traced back to a group of around 20 individuals.

Meta has also removed about 7,200 assets, including 1,300 Facebook accounts, 200 Facebook Pages, and 5,700 Facebook Groups, likewise based in Nigeria, that were providing tips for running scams.
