JAKARTA – Meta is taking new steps to protect children from predators on Instagram. The company says it will limit recommendations of adult-run accounts that feature children.
The policy will also affect accounts run by adults for commercial purposes, such as those of child performers or models. These accounts will not be recommended if the photos or videos they share feature children under 13.
"This includes adults who regularly share photos and videos of their children, and adults such as parents or talent managers who manage accounts representing teens or children under 13," Meta said, as quoted on Thursday, July 24.
Meta acknowledges that most adult accounts featuring children are harmless, but says preventive measures are still necessary because some people abuse them.
"Unfortunately, there are people who may try to abuse these accounts, leaving sexual comments under their posts or asking for sexual images in DMs (direct messages), which clearly violates our rules," Meta said.
Fundamentally, the measure is intended to curb predatory behavior. Meta was sued in 2023 over allegations that Facebook and Instagram had become a marketplace for predators.
Meta was accused of facilitating the discovery of child abuse material by recommending children's accounts to networks of pedophiles, algorithmic behavior that the Wall Street Journal had uncovered some time earlier.
In response, restrictions on recommending these accounts are now being rolled out. Meta will also hide comments that adults leave on children's posts when they appear suspicious, for example, messages that steer toward harassment.