YouTube Changes Election Misinformation Policy, Will Show Controversial Content
JAKARTA - YouTube has changed its election misinformation policy and will no longer remove controversial videos or those based on unproven claims. The move comes ahead of next year's US Presidential Election.
"Given that, and with the 2024 campaign going well, we will stop removing content promoting false claims that widespread fraud, error, or interference occurred in 2020 and previous US Presidential elections. This is taking effect today, Friday, June 2", said YouTube in a statement quoted on Saturday, June 3.
The Alphabet-owned video platform said that allowing these videos to remain is important for democratic societies because it enables open discussion and debate.
YouTube first introduced its election misinformation policy in December 2020. In the two years since, the company says it has removed tens of thousands of videos.
"We recognize it's time to re-evaluate the impact of the policy in the changing current conditions. In today's environment, we find that even though removing the content does reduce some misinformation, it can also have unwanted effects in the form of limiting political speeches without reducing the risk of violence or other real-world dangers meaningfully," said YouTube.
Although its misinformation policy has changed, YouTube says it will continue to surface election content from authoritative sources.
"We ensure that when people open YouTube to look for news and information about elections, they will see content from official sources strikingly at searches and recommendations," YouTube said.
After the 2020 US elections, for example, YouTube found that videos from authoritative sources such as news outlets were the most viewed and most recommended election videos on the platform.
"All of our election misinformation policies remain in effect, including those that prohibit content aimed at misleading voters about timing, place, means, eligibility requirements to vote, false claims that can materially prevent voting, including those that dispute the validity of voting through letters and content that encourages others to interfere with the democratic process," explained YouTube.
However, while YouTube pushed to limit the spread of misinformation during the 2020 US presidential election, its methods did not always prove effective.
A 2020 study found that YouTube's recommendation algorithms often showed videos questioning election results to users who were already skeptical.
The platform has since faced criticism for leaving up videos containing false election claims. YouTube also lifted the restrictions imposed on Donald Trump's account earlier this year.