
JAKARTA - TikTok is currently being investigated by the United States (US) Department of Homeland Security over the flood of child sexual abuse material (CSAM) and predatory behavior on its platform.

The investigation examines how TikTok handles content depicting child sexual abuse and the moderation controls it enforces.

The US Department of Homeland Security suspects that TikTok's Only Me feature is being misused to share content such as CSAM. The feature allows users to save their TikTok videos without posting them publicly.

Once a video's status is set to Only Me, it can only be seen by the account owner. In these cases, however, videos depicting CSAM were shared among fellow bad actors and predators.

As a result, such videos never reach the public feed and evade detection by TikTok's moderation systems. In addition, perpetrators also groom children by befriending them online with the aim of abusing them, either online or offline.

The Financial Times reported that TikTok moderators were unable to keep up with the volume of posted videos, meaning some abusive material also made it onto the public feed.

TikTok is also accused of lagging behind other social networks in detecting and stopping the perpetrators of these crimes.

"This is the perfect place for predators to meet, care for and engage children," said Erin Burke, head of the child exploitation investigation unit at Homeland Security's cybercrime division.

Burke says that international companies like TikTok are less motivated to cooperate with US law enforcement.

"We want (social media companies) to proactively ensure children aren't being exploited and abused on your site, and I can't say they do that, and I can say many US companies do," he added.

Predatory use of the platform is especially worrying given that TikTok's user base is dominated by teenagers. The company, for its part, responded by saying it is working closely with law enforcement to address the issue.

"TikTok has zero tolerance for child sexual abuse material. When we encounter any attempts to post, acquire, or distribute (CSAM), we remove the content, block accounts and devices, immediately report to NCMEC, and engage with law enforcement where appropriate." TikTok.

This is not the first investigation into TikTok. The number of US Department of Homeland Security investigations involving the distribution of child exploitation content on TikTok reportedly jumped sevenfold between 2019 and 2021.

Likewise, a BBC investigative report from 2019 revealed that predators targeted children as young as nine years old with foul comments and content on the ByteDance-owned platform.

For its part, TikTok has not been idle: the company recently implemented measures intended to keep its teenage user base safe.

Last year, TikTok announced that strangers would no longer be able to contact TikTok accounts belonging to children under the age of 16, and that those accounts would be set to private by default. It also tightened restrictions on downloading videos posted by users under the age of 18.
