
JAKARTA - Meta Platforms Inc.'s independent Oversight Board said on Thursday, October 20, that starting this October it can decide to apply a warning screen marking content as "disturbing" or "sensitive".

The board, which can already review user appeals to remove content, said it would be able to make binding decisions to apply a warning screen when it leaves up or restores eligible content, including photos and videos.

Separately, in its quarterly transparency report, the board said it received 347,000 appeals from Facebook and Instagram users worldwide in the second quarter, which ended June 30.

"Since we started receiving an appeal two years ago, we have received nearly two million appeals from users around the world," the board's report said. "This shows a continuous demand from users to appeal Meta content moderation decisions to independent bodies."

The Oversight Board, which includes academics, rights experts, and lawyers, was created by the company to rule on a small fraction of difficult content moderation appeals, but it can also advise on the site's policies.

Last month, the board objected to Facebook's removal of a newspaper report about the Taliban that was deemed favorable, siding with users' freedom of expression, and said the tech company relied too heavily on automated moderation.
