Apple Sued For Not Implementing Child Abuse Content Detection System In iCloud

JAKARTA - Tech giant Apple is reportedly being sued for not implementing a photo detection system to scan for child sexual abuse material (CSAM) in iCloud.

According to a report from The New York Times, the lawsuit, filed under a pseudonym by a 27-year-old woman, argues that Apple's decision not to deploy such a scanning system forces victims to relive their trauma.

In fact, Apple announced such a system in 2021, saying it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users' iCloud libraries.

However, according to TechCrunch, the plan appears to have been abandoned after security and privacy advocates warned that the system could create a backdoor for government surveillance.

In her lawsuit, the woman said that a relative abused her when she was an infant and shared images of her online. She said she still receives notices from law enforcement almost every day about someone being charged with possessing those images.

James Marsh, an attorney involved in the lawsuit, said there is a potential group of 2,680 victims who could be entitled to compensation in the case.

An Apple spokesperson commented on the lawsuit to The New York Times, saying the company is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."