Not Only COVID-19, YouTube Is Now Also Fighting Vaccine Misinformation On Its Platform
YouTube will remove vaccine misinformation from its platform. (photo: unsplash)

JAKARTA - YouTube will remove misinformation about vaccines in general, not only about COVID-19. The move comes as public pressure mounts on social media companies to stamp out dangerous misinformation that has made people hesitant to get vaccinated.

"We're constantly seeing false claims about a coronavirus vaccine spreading to misinformation and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines," YouTube said.

The content YouTube removes will cover false claims about vaccines for specific diseases as well as about vaccines in general. Amid continued vaccine hesitancy, the United States (US) reported more than 50,000 deaths from COVID-19 in September.

"Any content that falsely alleges that an approved vaccine is harmful and causes chronic health effects, and contains content that claims that the vaccine does not reduce disease transmission or contraction, or contains misinformation about the substance contained in the vaccine will be banned," YouTube said. as quoted from USA Today, Thursday, September 30.

The Google-owned platform said the policy would also cover videos claiming that vaccines cause autism or can track recipients, along with other ongoing vaccine conspiracy theories. The rule applies to vaccines that have been approved and confirmed safe and effective by local health authorities and the World Health Organization (WHO).

In addition, YouTube will terminate the accounts of anti-vaccine influencers such as Joseph Mercola, Erin Elizabeth, and Sherri Tenpenny. The three were identified by the Center for Countering Digital Hate as being among a dozen people who played a major role in spreading misinformation about the COVID-19 vaccine online.

"Anti-vaxxers have used social media platforms with impunity for far too long, risking the lives of millions, if not billions, of people around the world," CEO of the Center for Countering Digital Hate said. Imran Ahmed.

NBC News reports that anti-vaccine content creators have thrived on YouTube for more than a decade, moving to the Google-owned platform after other social media outlets stopped promoting their message, making anti-vaccine content ubiquitous there.

YouTube, like Facebook, Twitter, Reddit, and many other internet companies, gives users the freedom to post their own content. However, the social media giant is grappling with how to balance freedom of expression against effective policing of harmful content posted on its site.

As CNET has reported over the years, YouTube has grappled with many kinds of misinformation, conspiracy theories, discrimination, hate, abuse, child exploitation, and mass-murder videos, all at an unprecedented global scale. Critics argue that the company's content moderation efforts still often fail, or arrive too late.

YouTube has removed 130,000 videos specifically for COVID-19 vaccine misinformation since last year. The company has also said it removes nearly 10 million videos every quarter across all policy-violation categories, and since February 2020 it has removed more than 1 million videos containing harmful coronavirus information, such as false cures and other false claims.
