Recent Research Alleges TikTok Has Become a Nest of Two Types of Hazardous Content
TikTok shows eating disorder and suicide-related content to several new accounts. (Photo: TikTok)

JAKARTA - A recent study shows that TikTok's video recommendation algorithm serves eating disorder and suicide-related content to several new accounts.

Research conducted by the Center for Countering Digital Hate (CCDH) found that new accounts set up as teenagers were shown suicide-related content within 2.6 minutes of joining the platform.

Another account was recommended eating disorder content within eight minutes. The For You tab serves users a stream of recommended videos based on the type of content they engage with in the app.

The social media company says recommendations are based on a number of factors, including the videos you like, the accounts you follow, what you share, and device settings such as language preferences.

Britain's eating disorder charity, BEAT, said the findings were deeply worrying and asked TikTok to take immediate action to protect vulnerable users.

How the CCDH Research Was Conducted

CCDH set up two new accounts in each of the United Kingdom (UK), the United States (US), Canada, and Australia. Each standard account was registered under a female username with the age set to 13.

The second account in each country also included the phrase "lose weight" in its username and was categorized as a vulnerable account. CCDH researchers then analyzed the video content displayed on each new account's For You page for 30 minutes.

During that time, they interacted only with videos related to body image and mental health. The researchers found that standard teen accounts were shown videos related to mental health and body image every 39 seconds.

As reported by Sky News on December 16, not all of the content recommended at this rate is harmful, and the research did not differentiate between positive and negative content.

However, all of the accounts were shown eating disorder and suicide-related content, sometimes very quickly. CCDH also found that the vulnerable accounts were shown this type of content three times more often than the standard accounts, and that the content they were shown was more extreme.

According to CCDH's findings, TikTok hosts an eating disorder content community that has garnered more than 13.2 billion views across 56 different hashtags.

Around 59.9 million of those views came from hashtags with a high concentration of pro-eating disorder videos.

Even so, TikTok said the activity and experience generated in the study did not reflect the genuine behavior or viewing experience of real people.

"We regularly consult with health experts, remove our policy violations, and provide access to supporting resources for anyone in need," said a TikTok spokesperson.

"We recognize that trigger content is unique to each individual and remains focused on developing a safe and comfortable space for everyone, including people who choose to share their recovery journey or educate others about this important topic."
