JAKARTA - WhatsApp will not adopt Apple's new Child Safety feature, which is intended to stop the spread of child abuse images.

In a Twitter thread, WhatsApp head Will Cathcart explained his belief that Apple "has built software that can scan all the private photos on your phone," and said that Apple had gone the wrong way in trying to improve its response to child sexual abuse material, or CSAM.

Apple's plan, announced Thursday, August 5, involves taking hashes of images uploaded to iCloud and comparing them to a database containing known CSAM image hashes.

According to Apple, this lets it keep user data encrypted and run the analysis on the device, while still allowing it to report users to the authorities if they are found sharing child abuse imagery.
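The matching step described above can be sketched, in much-simplified form, as a set lookup over hashes. Note the heavy simplification: Apple's actual system uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, plus cryptographic protocols to keep the comparison private, whereas the sketch below uses a plain cryptographic hash and a hypothetical in-memory hash set:

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Simplified stand-in for a perceptual hash. A cryptographic hash
    only matches byte-identical files; Apple's NeuralHash is designed to
    survive resizing and re-encoding."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known images (illustrative only;
# the real database is supplied by child-safety organizations).
known_hashes = {image_hash(b"known-image-bytes")}

def flagged(upload: bytes) -> bool:
    """Return True if the upload's hash appears in the known-hash set."""
    return image_hash(upload) in known_hashes

print(flagged(b"known-image-bytes"))    # matches the database entry
print(flagged(b"holiday-photo-bytes"))  # no match
```

The design point critics focus on is the database itself: whatever hashes are placed in `known_hashes` determine what gets flagged, which is why adding non-CSAM entries would change what the system detects.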

Another branch of Apple's Child Safety strategy involves an optional warning to parents if their child under 13 sends or views a photo that contains sexually explicit content. An internal memo at Apple acknowledged that people would be "worried about the implications" of the system.

Cathcart called Apple's approach "deeply concerning," saying it would allow governments with differing ideas about what types of images are and are not acceptable to ask Apple to add non-CSAM images to the database it compares images against.

Cathcart said WhatsApp's system for combating child exploitation, which partly leverages user reports, preserves encryption like Apple's and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. Apple also works with the Center on its CSAM detection efforts.

WhatsApp owner Facebook has reason to pounce on Apple over privacy concerns. Apple's changes to how ad tracking works in iOS 14.5 started a fight between the two companies. Facebook criticized Apple's privacy changes as dangerous for small businesses. Apple hit back, saying that the change "only requires" that users be given the choice of whether to be tracked.

It's not just WhatsApp that has criticized Apple's new Child Safety measures. The list of people and organizations raising concerns includes Edward Snowden, the Electronic Frontier Foundation, professors, and more.

Matthew Green, a professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple's plans and about how the hashing system could be abused by governments and bad actors.

The EFF released a statement condemning Apple's plan, arguing that even "a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor." The EFF press release goes into detail about how it believes Apple's Child Safety measures can be abused by governments and how they undermine user privacy.

Kendra Albert, an instructor at Harvard's Cyberlaw Clinic, has a thread about the potential dangers to queer children and Apple's initial lack of clarity around age ranges for the parent notification feature.

Edward Snowden retweeted a Financial Times article about the system, adding his own characterization of what Apple is doing.

Politician Brianna Wu called the system "the worst idea in Apple's history."

Matt Blaze also tweeted concerns that the technology could be misused by overreaching governments trying to block content other than CSAM.

Epic CEO Tim Sweeney also criticized Apple, saying that the company "sucks everyone's data into iCloud by default." He also promised to share more thoughts specifically about Apple's Child Safety system.

However, not every reaction has been critical. Ashton Kutcher, who has done advocacy work to end the child sex trade since 2011, called Apple's work a "huge step forward" for efforts to eliminate CSAM.
