JAKARTA – In 2021, Apple announced a set of child safety features that included the detection of Child Sexual Abuse Material (CSAM) in iCloud Photos. Even before its release, the feature drew heavy criticism.
During the testing period, many users and developers criticized the feature over privacy concerns. After delaying it indefinitely, Apple ultimately abandoned the rollout of the CSAM detection system for its iCloud services.
The cancellation has now created a new dilemma for Apple: the company faces a lawsuit from thousands of CSAM victims. The suit was filed because Apple failed to deploy a system to detect and report CSAM.
According to a report by Ars Technica, the plaintiffs claim that Apple deliberately used cybersecurity arguments as a defense to sidestep its CSAM reporting obligations, and that the company benefits from the policies it currently has in place.
Without a CSAM detection system in iCloud, the lawsuit argues, Apple has effectively given predators license to exploit the service, since they can store CSAM there that most other technology companies would have detected and reported.
The plaintiffs also note that Apple reported only 267 cases of CSAM over the past year, while four other major technology companies submitted more than 32 million reports. That disparity could weaken Apple's position at trial.
If Apple loses before a jury, it could be liable for more than 1.2 billion US dollars (around Rp19 trillion). Apple could also be compelled to reinstate features that identify, remove, and report CSAM stored in iCloud.