JAKARTA - Since 2018, Bumble Inc., the parent company of Bumble, has been supporting US legislation to combat the unsolicited sending of nude images online, a practice known as cyberflashing.

"At Bumble Inc., the parent company of Bumble, Badoo and Fruitz, safety has been a central part of our mission and a core value that informs product innovation and the company's roadmap," the company said in its announcement.

The following year, Bumble began using technology to protect its users, launching its Private Detector AI feature.

"We've leveraged the latest advancements in technology and Artificial Intelligence (AI) to help provide our user community with the tools and resources they need to have a secure experience on our platform," the company added.

Private Detector works automatically, blurring potentially lewd images shared in chats on Bumble.

The recipient is notified and is free to decide whether to view the image or block it.
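The flow described above can be sketched in a few lines: a classifier scores an incoming image, and anything above a threshold is blurred and flagged so the recipient must opt in to view it. This is an illustrative sketch only; the function names, the box-blur stand-in, and the threshold are assumptions, not Bumble's actual implementation.

```python
# Hypothetical sketch of a Private-Detector-style moderation flow.
# The classifier score is assumed to come from an upstream model;
# here we only show the blur-and-flag decision.

def box_blur(pixels, radius=1):
    """Blur a 2D grid of grayscale values with a simple box filter."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += pixels[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

def moderate_image(pixels, lewd_score, threshold=0.8):
    """Return (image, flagged): blur and flag when the score is high."""
    if lewd_score >= threshold:
        # Flagged: deliver blurred; recipient must opt in to see it.
        return box_blur(pixels), True
    # Below threshold: deliver the image untouched.
    return pixels, False

# Usage: a fake 3x3 grayscale image and a high classifier score.
image = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]
blurred, flagged = moderate_image(image, lewd_score=0.93)
```

The key design point is that the image is never discarded: it is obscured and the choice to view or block is left to the recipient, as the article describes.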

Now, Bumble's data team has published a white paper explaining the Private Detector AI technology and has released an open-source version of it.

The company says the open-source version of Private Detector is now available on GitHub.

"We hope this feature will be adopted by the wider technology community as we work together to make the internet a safer place," the company wrote in its announcement.
