JAKARTA - Facebook is back in the spotlight following a study on the popularity of fake news on the platform. The study found that misinformation draws more clicks and higher engagement from users than verified information, especially among right-wing audiences.
By analyzing a sample of more than 2,500 pages, researchers from New York University in the United States and the Université Grenoble Alpes in France found that misinformation generates six times more engagement on Facebook than verified news.
The research covered activity on Facebook between August 2020 and January 2021, spanning the US presidential election period.
The study's findings, reported in The Washington Post, show that the pages that publish fake news most frequently receive more "Likes" and generate more shares and comments.
According to Rebekah Tromble, Director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the researchers' findings, the study "helps add to the growing body of evidence that, despite various mitigation efforts, misinformation has found a comfortable home, and an engaged audience, on Facebook."
With the help of NewsGuard and Media Bias/Fact Check, the researchers examined thousands of Facebook publishers from across the political spectrum and their propensity to share reliable or unreliable content.
The study then compared interactions on posts from pages known for fake news, from Occupy Democrats (leaning left) to political commentator Dan Bongino and the conservative media outlet Breitbart (both leaning right).
The results highlight the more viral nature of far-right and far-left content compared to more factual political content. On the right, however, fake news is more likely to circulate than in any other political category.
This is a significant finding because the study was conducted around the time of the US presidential election in November 2020.
The findings did not sit well with Rafael Rivero, co-founder and president of Occupy Democrats. "We sometimes make minor mistakes and issue corrections immediately, but we will never intentionally mislead our readers," he said in a statement to The Washington Post.
This isn't the first time Facebook's algorithms have been called into question. A spokesperson for the social network, however, objected that the study did not take post impressions into account, that is, the number of times users actually saw a post displayed on their page.
"This report looks largely at how people engage with content, which shouldn't be confused with how many people actually see it on Facebook," Facebook spokesman Joe Osborne told The Washington Post.
"When you look at the most-reached content on Facebook, it's not at all what this study suggests," he added.
Recently, a widely publicized report on the platform's most viewed content, covering the first quarter of 2021, showed that a Chicago Tribune article linking a doctor's death to a COVID-19 vaccination topped the list.
This was another blow to Facebook, which President Joe Biden had accused of "killing people" over misleading COVID-19 content circulating in the US.
While Facebook's algorithms don't necessarily favor fake news or any particular part of the political spectrum, users appear to engage more with misinformation, which in turn helps it go viral on the social network.
According to The Washington Post: "Among publishers categorized as far right, those who share misinformation get the majority, 68%, of all engagement from users."