JAKARTA - Instagram is rumored to be testing a security feature already common on mobile phones: a face verification system based on selfie videos.

According to a report shared by social media consultant Matt Navarra, the new verification process requires footage of the user's face from various angles to confirm that they are the genuine owner of the account and an actual human, not a bot.

According to a report from XDA Developers, as quoted by The Verge on Wednesday, November 17, the company actually tested the feature last year but ran into technical problems.

However, some users have recently reported being asked to take a selfie video to verify their account, including Twitter user @bettinamak, who posted a screenshot of the prompt.

According to the screenshot, @bettinamak was asked to record a selfie video showing their face from all angles to prove they are a real person.

The move may come as a surprise, given that Meta recently announced it would shut down one of its face recognition features. However, as the company has reiterated, it is only turning off certain Facebook features, not ending Meta's use of facial recognition entirely.

In response, Instagram explained via its official Twitter account how the feature will work: it will request selfie video verification from accounts exhibiting suspicious behavior, such as an account that follows many other accounts in quick succession.

The company also confirmed that the feature does not use facial recognition; instead, the Instagram team reviews the videos manually, and users' selfie videos are deleted after 30 days.
