Mark Zuckerberg, CEO of Meta, has introduced Threads, the company's Twitter-style app, as a "friendly" venue for public discussion, in contrast to the more combative tone of Twitter under US billionaire Elon Musk.

"We definitely focus on hospitality and make this a friendly place," said Meta CEO Zuckerberg, on Wednesday, July 5, shortly after the launch of the app.

Maintaining that idealistic vision for Threads, which attracted more than 70 million users in its first two days, is another matter.

Meta Platforms is no newcomer to managing large, often unruly online crowds. The company says new Threads users will be held to the same rules as those on Instagram, its photo- and video-sharing service.

The owner of Facebook and Instagram has also actively embraced an algorithmic approach to serving up content, which gives it greater control over the kind of material that does well as it steers toward entertainment and away from news.

But by connecting Threads with other social media services such as Mastodon, and given microblogging's appeal to news junkies, politicians, and other fans of rhetorical combat, Meta is also courting fresh challenges with Threads and is trying to chart a new path through them.

"First of all, the company will not expand its existing fact-checking program to Threads," spokeswoman Christine Pai said in a written statement sent via email on Thursday, July 6. This removes Meta's distinguishing feature in managing misinformation in other applications.

Pai added that posts rated as misinformation by fact-checking partners on Facebook or Instagram would carry those labels over if posted on Threads.

Asked by Reuters why it was taking a different approach to misinformation on Threads, Meta declined to answer.

In a New York Times podcast last Thursday, Adam Mosseri, the head of Instagram, acknowledged that Threads is more supportive of public discourse than Meta's other services and is therefore likely to draw a news-focused crowd, but said the company aims to focus on lighter subjects such as sports, music, fashion, and design. However, Meta's ability to keep its distance from controversy was tested immediately.

Within hours of the launch, Threads accounts seen by Reuters were posting about the Illuminati and "billionaire satanists," while other users compared one another to Nazis and battled over everything from gender identity to violence in the West Bank.

Conservative figures, including the son of former US President Donald Trump, complained of censorship after labels appeared warning would-be followers that the accounts had posted misinformation. Another Meta spokesperson said the labels were an error.

Further content moderation challenges are in store once Meta links Threads to the "fediverse," where users on servers operated by other, non-Meta entities will be able to communicate with Threads users. Meta's Pai said Instagram's rules would apply to those users as well.

"If an account or server, or if we find multiple accounts of a given server, violates our rules, then they will be blocked from accessing Threads, which means the content of the server will not appear on Threads and vice versa," he said.

Still, researchers specializing in online media say the devil will be in the details of how Meta handles those interactions.

Alex Stamos, director of the Stanford Internet Observatory and former head of security at Meta, posted on Threads that the company would face greater challenges in performing key types of content moderation enforcement without access to back-end data about users who post banned content.

"With the federation, metadata used by major platforms to connect accounts with one single actor or detect abusive behavior on a large scale is not available," Stamos said. "This will make it more difficult to stop spammers, troll farms, and economically aimed users."

In follow-up posts, he said he expected Threads to limit the visibility of fediverse servers with large numbers of abusive accounts and to apply harsher penalties to those posting illegal material such as child pornography. Even so, those interactions raise challenges of their own.

"There are some very strange complications that arise once you start thinking about illegal things," Solomon Messing of the Center for Social Media and Politics at New York University said. He cited examples such as exploitation of children, sexually images without consent, and selling weapons.

"If you find that type of material when you index content (from other servers), do you have a responsibility outside just block it from Threads?"
