JAKARTA - A deeper investigation into Facebook's lack of controls to prevent misinformation and abuse in languages other than English is likely to leave people "more shocked" about the potential harm the social media company is causing, Facebook whistleblower Frances Haugen told Reuters.

Haugen, a former product manager at Facebook, which is owned by Meta Platforms Inc, spoke at the Reuters Next conference on Friday, December 3.

She left the company last May with thousands of internal documents that she leaked to the Wall Street Journal. The documents led to a series of articles, published last September, detailing how Facebook knew its apps had helped spread divisive content and harmed the mental health of some of its young users.

In the interview with Reuters, Haugen also noted that Facebook knows it has too few workers with the language skills needed to identify and remove inappropriate posts from users in some developing countries.

People who use the platform in languages other than English are using a "raw and dangerous version of Facebook," Haugen said.

Facebook has consistently said it disagrees with Haugen's characterization of its internal research and that it is proud of the efforts it has made to stop abuse on the platform.

Haugen said Facebook should be required to disclose which languages its technological safety systems support; otherwise, "Facebook will do ... minimally to minimize PR risks," she said.

Internal Facebook documents published by Haugen also raise new concerns about how the social media company may be failing to take action to prevent the spread of misleading information.

Haugen said the company knew it could introduce "strategic friction" to slow users down before they re-share posts, for example by requiring them to click on a link before they can share the content. But she said the company avoided taking such measures to preserve profits.

Legal and internet experts, speaking in a separate panel at the Reuters Next conference last Friday, said such measures could help encourage users to reconsider content they are about to share.

"In orchestrating speech, you give states the power to manipulate speech for their own ends," said David Greene, director of civil liberties at the Electronic Frontier Foundation.

The documents published by Haugen have led to a series of hearings in the US Congress. Adam Mosseri, the head of Meta Platforms' Instagram app, will testify next week about the app's effects on young people.

Asked what she would tell Mosseri if given the chance, Haugen said she would question why the company has not released more of its internal research.

"We have evidence now that Facebook has known for years that it is harming children," he said. "How can we trust you going forward?".
