JAKARTA - Meta's social media platforms, Facebook and Instagram, will be investigated for suspected violations of EU online content rules on child safety, EU regulators said on Thursday, May 16. The probe could lead to major fines.

Tech companies are required to do more to tackle illegal and harmful content on their platforms under the European Union's Digital Services Act (DSA), which has been in effect since last year.

The European Commission said it had decided to open an in-depth investigation into Facebook and Instagram over concerns that they were not doing enough to address risks to children. Meta submitted a risk assessment report in September.

"The Commission is concerned that the systems behind Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children, as well as create so-called 'rabbit hole' effects," the EU executive said in a statement. "In addition, the Commission is also concerned about the age-verification methods put in place by Meta." The regulator's concerns relate to children's access to inappropriate content.

Meta said it already has a number of online tools in place to protect children.

"We want young people to have a safe and age-appropriate online experience, and have spent a decade developing more than 50 tools and policies designed to protect them," said a Meta spokesperson. "This is a challenge facing the entire industry, and we look forward to sharing the details of our work with the European Commission."

Meta is already under EU scrutiny over election disinformation, which is of major concern ahead of next month's crucial European Parliament election. DSA violations could lead to fines of up to 6% of the company's global annual revenue.
The English, Chinese, Japanese, Arabic, and French versions are generated automatically by AI, so inaccuracies may remain; please refer to the Indonesian version as our primary text. (system supported by DigitalSiber.id)