JAKARTA - An appeals court in the United States on Wednesday weighed whether TikTok's video-based social media platform can be sued over the death of a 10-year-old girl for promoting deadly "blackout" challenges that encourage people to choke themselves.

Members of a three-judge panel of the Philadelphia-based 3rd US Circuit Court of Appeals noted during oral arguments that a key federal law typically shields internet companies such as TikTok from lawsuits over content posted by users.

However, some of the judges questioned whether Congress, in adopting Section 230 of the Communications Decency Act in 1996, could have envisioned the growth of platforms like TikTok that do not merely host content but also recommend it to users through complex algorithms.

"That almost certainly did not exist in the mid-1990s, or at least not deployed at the scale it is now," said US Circuit Judge Paul Matey.

Tawainna Anderson sued TikTok and its China-based parent company, ByteDance, after her daughter Nylah in 2021 attempted the blackout challenge using a purse strap hung in her mother's closet. She lost consciousness, suffered severe injuries, and died five days later.

Anderson's lawyer, Jeffrey Goodman, told the court that while Section 230 provides some legal protection to TikTok, it does not bar claims that the company's product was defective and that its algorithm pushed videos about the blackout challenge to the child.

"This is TikTok consistently sending dangerous challenges to an impressionable 10-year-old, sending multiple versions of this blackout challenge, which led her to believe it was cool and fun," Goodman said.

TikTok's lawyer, Andrew Pincus, argued that the panel should uphold a lower-court judge's October 2022 ruling that Section 230 barred Anderson's case.

Pincus warned that ruling against his client would render Section 230's protections "meaningless" and open the door to lawsuits against search engines and other platforms that use algorithms to curate content for their users.

"Every claim could then allege a product defect in the way the algorithm was designed," he said.

Nonetheless, US Circuit Judge Patty Shwartz questioned whether the law could fully shield TikTok, given that the company "had to make a decision about whether to tell someone who opened the app that there was harmful content here."

The arguments come as TikTok and other social media companies, including Meta Platforms, the owner of Facebook and Instagram, face pressure from regulators around the world to protect children from harmful content on their platforms.

US state attorneys general are investigating TikTok to determine whether the platform causes physical or psychological harm to children.

TikTok and other social media companies are also facing hundreds of lawsuits accusing them of enticing millions of children onto their platforms and making them addicted, damaging their mental health.
