JAKARTA - Meta has received dozens of recommendations from its Oversight Board regarding revisions to its cross-check program, which shields high-profile users (VIPs) from the company's automated content moderation system.

The Oversight Board, the independent body that reviews Meta's content moderation decisions, issued 32 recommendations for changing the program, which places content from VIP users in a separate moderation queue from the automated queues the company uses for regular users.

Instead of being removed immediately, flagged content from these public figures, such as politicians, celebrities, and athletes, is left up while it awaits further human review. In some cases, this has allowed them to post content that violates the rules without consequence.

In response, Meta agreed to adopt 26 of the Oversight Board's 32 recommendations, but declined to implement changes that would increase transparency around who is in the program.

"(Companies will cross-check) more transparently through regular reporting and refine our criteria for inclusion on the list to better take into account the interests and equity of human rights," said Meta Global Affairs President Nick Clegg, quoted from The Hill, Monday, March 6 .

"We will also change our operational system of cross-examination to help reduce delinquent review requests and reduce the time required to review cases," he added.

The Oversight Board's review was conducted in direct response to a 2021 Wall Street Journal article revealing that users in the program were effectively exempt from the company's rules.

In its decision, the Oversight Board acknowledged the many challenges inherent in moderating content at scale: while content review systems must treat all users fairly, the program must also grapple with the broader challenge of moderating immense volumes of content.

For example, at the time of the review, the board noted that Meta makes about 100 million moderation attempts every day; even at 99 percent accuracy, that volume would still produce one million errors per day.

The Oversight Board found that the cross-check program was structured more directly to address business concerns than to advance Meta's human rights commitments.

It urged the company to radically increase transparency about the cross-check system and how it works.

Regarding cross-check itself, Meta says it will refine its processes to temporarily remove or hide potentially harmful content while it awaits additional review.

The company will also work to ensure content reviewers have the appropriate language and regional expertise, where possible.

However, the Oversight Board recommended that Meta establish clear and transparent criteria distinguishing users who are entitled to additional protection from a human rights perspective from those who are included for business reasons.

To that end, Meta says it will refine its criteria to draw out that distinction, but it currently has no plans to publish the more specific criteria that determine whether a user qualifies, arguing that doing so could make the system vulnerable to manipulation.
