JAKARTA — A Kenyan court filing has revealed new evidence accusing Meta Platforms Inc., Facebook's parent company, and its content-moderation contractor of ignoring threats from an Ethiopian rebel group against moderators working for the platform. The moderators had previously blocked video posts from the group, which led to them being targeted with threats.

In a court document filed on December 4 by the nonprofit Foxglove, the Oromo Liberation Army (OLA) rebel group is said to have threatened moderators focused on Ethiopian content for deleting its posts on Facebook. One of the moderators, Abdikadir Alio Guyo, said in a statement that he received a message warning him to stop removing OLA content or face serious consequences.

"Since receiving the threatening message, I have lived in fear, even when visiting my family in Ethiopia," said another moderator, Hamza Diba Tubi, who received a similar threat that included his name and address, as well as those of his colleagues.

The OLA, a splinter group of a once-banned opposition party, has been accused by the Oromiya regional government of killing many civilians following the failure of peace talks in Tanzania in 2023.

The moderators who filed the lawsuit also accused the Kenya-based contractor of not taking their complaints seriously. The company initially accused the moderators of fabricating the threats before eventually conducting an investigation and moving one moderator, who had been publicly identified by the rebels, to a safe location.

Last year, 185 content moderators sued Meta and two contractors, including Sama, over the loss of their jobs after they tried to form a union. The moderators also claim they were blacklisted, preventing them from applying for similar jobs with a new contractor, Majorel, after Meta switched service providers.

This case highlights Meta's challenges in working with content moderators around the world. Moderators are tasked with reviewing graphic content uploaded to Meta's platforms and often face direct threats as well as mental stress from exposure to harmful material.

In addition, the court documents revealed that Meta ignored recommendations from experts it had hired to address hate speech in Ethiopia. Alewiya Mohammed, a moderation supervisor, said she was stuck in an endless cycle of reviewing hateful content that technically did not violate Meta's policies.

This case, in which an attempt at an out-of-court settlement failed in October 2023, could have a major impact on how Meta manages content moderation globally.

Meanwhile, in a separate case filed in 2022, Meta is also accused of allowing violent and hateful posts from Ethiopia to spread on Facebook, which allegedly fueled the civil war between the federal government and Tigray regional authorities. Representatives of Meta, Sama, and the OLA have not commented officially on these allegations.


The English, Chinese, Japanese, Arabic, and French versions of this article are automatically generated by AI, so translation inaccuracies may remain; please treat the Indonesian version as authoritative. (System supported by DigitalSiber.id)