JAKARTA — A US federal appeals court has ruled that social media platform X (formerly known as Twitter) must face some lawsuits accusing the company of negligence in handling videos of child sexual exploitation on its platform.

The ruling was issued on Friday, August 1, by the 9th US Circuit Court of Appeals in San Francisco. The court held that although X is broadly protected by Section 230 of the Communications Decency Act, which grants platforms immunity for content posted by their users, the company can still be held liable for negligence after it becomes aware of the existence of explicit content.

The case began before Elon Musk acquired Twitter in 2022. The lawsuit was dismissed by a district court judge in December 2023, but has now been revived by the appeals court's decision. Musk himself is not a defendant in the case, and X's legal representatives have yet to comment.

In the lawsuit, the plaintiffs, identified as John Doe 1 and John Doe 2, stated that when they were 13 years old, they were deceived by a Snapchat user who claimed to be a 16-year-old girl from their school.

The user was in fact a trafficker in child pornography who coerced them into sending more explicit images under threat of extortion. The images were then compiled into a video and uploaded to Twitter.

According to court documents, the video was viewed more than 167,000 times before Twitter finally removed it, nine days after the platform received a report about the content. The platform is also alleged to have been slow in reporting the case to the National Center for Missing and Exploited Children (NCMEC), which is legally required once a platform learns of such material.

Judge Danielle Forrest stated in her ruling that Section 230 does not shield a platform from negligence claims once it has actual knowledge of child pornography content. "The facts alleged, coupled with the legal 'actual knowledge' requirement, separate the obligation to report to NCMEC from Twitter's role as a publisher of content," she wrote in the three-judge panel's decision.

Another claim X must still face is the accusation that the platform's infrastructure makes it difficult for users to report child pornography content. However, the court ruled that X cannot be sued on allegations of benefiting from sex trafficking, or over search features said to have amplified the spread of the illegal content.

Dani Pinter, a lawyer from the National Center on Sexual Exploitation who represents the plaintiffs, welcomed the decision. "We are looking forward to discovery and ultimately to trial against X to seek justice and accountability," she said in an official statement.

The case highlights the growing legal pressure on digital platforms, particularly over the protection of children and the handling of illegal content spread by users.
