JAKARTA - Apple has received an open letter urging the company to cancel its plan to scan the photos of Mac, iPad, and iPhone users for evidence of child abuse.

The letter has been signed by privacy advocates, security experts, technology companies, and legal specialists. In it, they say Apple's current proposal threatens to undermine decades of work by technologists, academics, and policy advocates toward making strong privacy-preserving measures the norm across most consumer electronics and use cases.

"We ask Apple to reconsider the launch of its technology, so as not to cancel this important work," the letter said.

The signatories made two requests: that Apple immediately halt deployment of its proposed content monitoring technology, and that it issue a statement reaffirming its commitment to end-to-end encryption and user privacy.

Apple's child protection plan is well intentioned, but it has also come under heavy criticism. With the release of macOS Monterey, iOS 15, and iPadOS 15, the company plans to roll out CSAM (Child Sexual Abuse Material) detection, which checks image hashes against a database of known abuse material.
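As a rough illustration of what hash-based matching means, the sketch below computes a toy perceptual "average hash" for an image and checks it against a set of known hashes. This is only a simplified stand-in: Apple's system uses its own NeuralHash algorithm and a blinded, encrypted database rather than a plain lookup table, and the file paths and hash values here are hypothetical.

```python
# Toy sketch of hash-based image matching (NOT Apple's NeuralHash).
# Requires Pillow: pip install Pillow
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale image and set one bit per pixel,
    depending on whether the pixel is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits

# Hypothetical set of known hashes; the real database is distributed in a
# blinded form that the device cannot read directly.
KNOWN_HASHES = {0x8F3C21A055E790BB}

def matches_known(path: str) -> bool:
    return average_hash(path) in KNOWN_HASHES
```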

The technology has been likened to a backdoor into users' files, and it has alarmed privacy experts. In a post in the Child Safety section of its website, Apple said it wants to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of CSAM.

With that in mind, the company plans to scan images sent via Messages or uploaded to iCloud for known CSAM images. Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts.

First, new communication tools will allow parents to play a more informed role in helping their children navigate online communication. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
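A minimal sketch of what "on-device" means in this design, assuming a hypothetical local classifier: the sensitivity decision is made entirely on the phone, and only a yes/no flag drives the blur and warning behavior, so the message content is not uploaded anywhere in this sketch.

```python
# Sketch of an on-device warning flow; looks_sensitive() stands in for a local
# ML model that Apple has not published, so its behavior here is hypothetical.
def looks_sensitive(image_bytes: bytes) -> bool:
    # Placeholder for an on-device classifier; always returns False in this sketch.
    return False

def handle_incoming_image(image_bytes: bytes) -> dict:
    warn = looks_sensitive(image_bytes)   # runs locally, no network call
    return {
        "blur_preview": warn,             # blur the image until the child taps through
        "show_warning": warn,             # show an age-appropriate warning
        "sent_to_apple": False,           # the content never leaves the device
    }
```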

Furthermore, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online while designing for user privacy. CSAM detection will help Apple provide law enforcement with valuable information about collections of CSAM in iCloud Photos.

Lastly, updates to Siri and Search will provide parents and children with more information and help if they encounter an unsafe situation. Siri and Search will also intervene when users try to search for CSAM-related topics.

There are some worrisome aspects to Apple's approach, one of which is misdetection: the hash of an image is not guaranteed to be unique, so a known abuse image could share a hash with something completely innocent.
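The toy demonstration below shows how such a collision can happen with a simple average hash: two images with different pixel values but the same bright/dark pattern produce identical hashes. It illustrates the general risk only; it is not a collision against Apple's NeuralHash.

```python
# Two visually different 8x8 images that share the same toy average hash.
from PIL import Image

def make_checkerboard(dark: int, light: int, size: int = 8) -> Image.Image:
    data = [dark if (x + y) % 2 == 0 else light
            for y in range(size) for x in range(size)]
    img = Image.new("L", (size, size))
    img.putdata(data)
    return img

def average_hash_of(img: Image.Image) -> int:
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits

a = make_checkerboard(dark=0, light=255)   # high-contrast checkerboard
b = make_checkerboard(dark=60, light=180)  # much lower contrast, same pattern
print(average_hash_of(a) == average_hash_of(b))  # True: different images, same hash
```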

Privacy advocates at the Electronic Frontier Foundation (EFF) say this is a huge breach of privacy. Apple's plan to scan photos uploaded to iCloud Photos is similar to Microsoft's PhotoDNA.

The difference is that Apple's scanning will happen on the device. The (unauditable) database of processed CSAM images will be distributed in the operating system, and the processed images are transformed so that users cannot see what they depict.

This means that when the feature rolls out, a version of the NCMEC CSAM database will be uploaded to every iPhone. Once a certain number of matching photos is detected, they are sent to human reviewers at Apple, who determine whether they are actually part of the CSAM database.

If the reviewers confirm the match, the photos will be sent to NCMEC and the user's account will be deactivated. Again, the point here is that, whatever privacy and security properties the technical details provide, all photos uploaded to iCloud will be scanned.
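To make the threshold-then-review flow concrete, here is a minimal sketch under stated assumptions: matches are simply counted per account and escalated to human review once a hypothetical threshold is crossed. In Apple's actual design the count is enforced cryptographically with threshold secret sharing, so its servers learn nothing about an account until the threshold is met.

```python
# Hypothetical threshold-then-review bookkeeping; the names and the threshold
# value are illustrative, not Apple's published parameters.
from collections import defaultdict

MATCH_THRESHOLD = 30  # assumed value for illustration only

match_counts: defaultdict = defaultdict(int)
matched_photos: defaultdict = defaultdict(list)

def record_match(account_id: str, photo_id: str) -> None:
    """Record one hash match; hand the account to human review at the threshold."""
    match_counts[account_id] += 1
    matched_photos[account_id].append(photo_id)
    if match_counts[account_id] >= MATCH_THRESHOLD:
        escalate_for_review(account_id, matched_photos[account_id])

def escalate_for_review(account_id: str, photo_ids: list) -> None:
    # Placeholder: in the described flow, reviewers confirm the matches before
    # anything is reported to NCMEC and the account is deactivated.
    print(f"{account_id}: {len(photo_ids)} matched photos queued for human review")
```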

