JAKARTA - WhatsApp will not adopt Apple's new Child Safety features, which are intended to stop the spread of child sexual abuse images.
In a Twitter thread, WhatsApp head Will Cathcart explained his belief that Apple "has built software that can scan all the private photos on your phone," and said that Apple had gone the wrong way in trying to improve its response to child sexual abuse material, or CSAM.
Apple's plan, announced Thursday, August 5, involves taking hashes of images uploaded to iCloud and comparing them to a database containing known CSAM image hashes.
According to Apple, this allows it to keep user data encrypted and run the analysis on-device, while still allowing it to report users to the authorities if they are found to be sharing child abuse imagery.
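Conceptually, the matching step reduces to a set-membership test plus a reporting threshold. The Python sketch below is purely illustrative, with a hypothetical hash database and threshold: it uses a plain SHA-256 file digest as a stand-in, whereas Apple's actual design uses a perceptual hash (NeuralHash) together with cryptographic blinding, so that neither the device nor Apple learns which images matched until the threshold is crossed.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes. In Apple's real system these
# would be perceptual NeuralHash values supplied by NCMEC, stored on-device
# only in a cryptographically blinded form.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Hypothetical threshold: Apple said only that a match threshold must be
# crossed before anything is surfaced for human review.
MATCH_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Stand-in hash: SHA-256 of the raw file bytes. A real deployment would
    use a perceptual hash so re-encoded or resized copies still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photos: list[Path]) -> int:
    """Count how many photos hash into the known database."""
    return sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)


def should_escalate(photos: list[Path]) -> bool:
    """Flag an account for review only once matches cross the threshold."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

The key difference is that the real protocol never exposes the plaintext database or per-image match results the way this sketch does; that opacity is precisely what critics worry could hide an expanded database.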
Another branch of Apple's Child Safety strategy involves an optional warning to parents if their child under 13 sends or views a photo that contains sexually explicit content. An internal memo at Apple acknowledged that people would be "worried about the implications" of the system.
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no.
— Will Cathcart (@wcathcart) August 6, 2021
Cathcart called Apple's approach "deeply concerning," saying it would allow governments with differing ideas about what kinds of images are and are not acceptable to demand that Apple add non-CSAM images to the database it matches against.
Cathcart said WhatsApp's system for combating child exploitation, which partly leverages user reports, preserves encryption just as Apple's does, and it led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. Apple is also working with the Center on its CSAM detection efforts.
Apple plans to modify iPhones to constantly scan for contraband: “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering. https://t.co/rS92HR3pUZ
— Edward Snowden (@Snowden) August 5, 2021
WhatsApp owner Facebook has reason to pounce on Apple over privacy concerns. Apple's changes to how ad tracking works in iOS 14.5 started a fight between the two companies. Facebook criticized Apple's privacy changes as dangerous for small businesses. Apple hit back, saying that the change "only requires" that users be given the choice of whether to be tracked.
It's not just WhatsApp that has criticized Apple's new Child Safety measures. The list of people and organizations raising concerns includes Edward Snowden, the Electronic Frontier Foundation, professors, and more.
Matthew Green, a professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple's plans and about how the hashing system could be abused by governments and bad actors.
The EFF released a statement condemning Apple's plan, saying that even "a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." The press release goes into detail about how the EFF believes Apple's Child Safety measures could be abused by governments and how they undermine user privacy.
Kendra Albert, an instructor at Harvard's Cyberlaw Clinic, has a thread about the potential dangers to queer children and Apple's initial lack of clarity around age ranges for the parent notification feature.
Edward Snowden retweeted a Financial Times article about the system, adding his own characterization of what Apple is doing.
Politician Brianna Wu called the system "the worst idea in Apple's history."
It's atrocious how Apple vacuums up everybody's data into iCloud by default, hides the 15+ separate options to turn parts of it off in Settings underneath your name, and forces you to have an unwanted email account. Apple would NEVER allow a third party to ship an app like this.
— Tim Sweeney (@TimSweeneyEpic) August 6, 2021
Security researcher Matt Blaze also tweeted concerns that governments could push the technology beyond its stated scope, pressing Apple to block content other than CSAM.
Epic CEO Tim Sweeney also criticized Apple, saying the company "vacuums up everybody's data into iCloud by default." He also promised to share more thoughts specifically on Apple's Child Safety system.
I believe in privacy - including for kids whose sexual abuse is documented and spread online without consent. These efforts announced by @Apple are a major step forward in the fight to eliminate CSAM from the internet. https://t.co/TQIxHlu4EX
— ashton kutcher (@aplusk) August 5, 2021
However, not every reaction has been critical. Ashton Kutcher, who has done advocacy work to end child sex trafficking since 2011, called Apple's work "a major step forward" in the fight to eliminate CSAM.