Mark Zuckerberg Accused Of Not Caring About The Flood Of Misleading Information On Facebook, As Long As Traffic Keeps Increasing
JAKARTA - Misinformation about COVID-19 is rampant on Facebook, and as recently revealed in internal documents, CEO Mark Zuckerberg actively resisted researchers' efforts to stop it.
While every Big Tech brand is facing plenty of heat right now, Facebook has become a huge target for a number of reasons. Facebook is said to have been one of the biggest sources of misinformation during the 2016 and 2020 US elections, is often filled with rampant conspiracy theories, and is now home to relentless lies about COVID-19 and the vaccines against it.
This criticism escalated sharply when The Wall Street Journal published a series of reports in September. Based on internal documents leaked by whistleblower Frances Haugen (a former data scientist at Facebook), the reports reaffirmed what many have long suspected about Facebook: for example, that it actively sows division among users, that it is harmful to children, and more.
After Haugen testified against Facebook on Capitol Hill and submitted more documents to the SEC, a consortium of 17 news publications reviewed the documents and published their own findings (now known as the Facebook Papers).
In one of those articles, published by The Washington Post, a very interesting tidbit was revealed. It's no secret that misinformation is everywhere on Facebook. Whether it's people creating or sharing inaccurate posts about the pandemic, elections, or other current events, it is arguably one of the biggest problems facing the site today.
In April 2020, Facebook researchers devised a system that could potentially reduce coronavirus-related misinformation by 38 percent. When presented with this finding, Mark Zuckerberg declined to act on it and allowed false information about COVID to continue to spread.
According to the report, the researchers were looking at ways to reduce the "hate speech, nudity, graphic violence, and misinformation" flooding people's News Feeds on Facebook. The solution they found would essentially limit the promotion of content deemed misleading.
As the report explains, the researchers proposed limiting the boost given to content that the News Feed algorithm predicts will be reshared in long chains, since serial resharing tends to correlate with misinformation. After this solution was presented to Zuckerberg, along with evidence that it could reduce COVID-19 misinformation by nearly 40 percent, Zuckerberg dismissed the researchers' proposal.
Why? According to Anna Stepanov, the Facebook director who led the project, Mark did not think the change could be rolled out broadly. More specifically, Zuckerberg's response to the team was that the fix would not launch if it came with a material tradeoff with MSI.
MSI is Facebook's internal shorthand for "meaningful social interactions". It is essentially how Facebook measures how people use and interact with the site: if MSI goes up, people are spending more time on Facebook and engaging with other users; if MSI goes down, the opposite is happening. In other words, Zuckerberg passed on a tool that could drastically reduce COVID misinformation because it had the potential to hurt the site's traffic.
That's also not too surprising given Facebook's track record. This is a company that has been shown time and again to put its own interests above everyone else's. Whether it's knowing that the “Like” button causes anxiety among younger users, or its own Oversight Board acknowledging transparency problems, Facebook has plenty of issues that need to be addressed. Instead, it has toyed with a name change and invested $10 billion in its metaverse fantasy.