JAKARTA - Apple has revealed plans to scan iPhones in the United States for images of child sexual abuse. The move has been welcomed by child protection groups, but it has also raised concern among some security researchers that the system could be abused by governments seeking to keep tabs on their citizens.

Apple says its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. A tool Apple calls "neuralMatch" will detect known child sexual abuse images without decrypting people's messages. If a match is found, the image will be reviewed by a human who can notify law enforcement if necessary.
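Apple has not published implementation details in this report, but the flow described above (hash an image on the device, compare it against a list of known-image hashes, and send any match to human review) can be illustrated with a minimal, purely hypothetical Python sketch; the names and data below are illustrative assumptions, not Apple APIs.

```python
# Hypothetical sketch of on-device matching against a list of known image
# hashes, followed by a human-review gate. Illustrative only: it uses an
# ordinary cryptographic hash, whereas a real system such as Apple's relies
# on a perceptual ("neural") hash and additional privacy safeguards.

import hashlib
from pathlib import Path

# Assumed: hashes of known abuse images, distributed to the device.
KNOWN_HASHES: set[str] = set()  # e.g. loaded from an on-device database

def image_hash(path: Path) -> str:
    """Toy stand-in for a perceptual hash: a plain SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_for_review(path: Path) -> bool:
    """Return True if the image matches the known list; a match is only a
    candidate and would still be reviewed by a human before any report."""
    return image_hash(path) in KNOWN_HASHES
```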

But researchers say the tool could be used for other purposes such as government surveillance of dissidents or protesters.

Matthew Green of Johns Hopkins University, a leading cryptography researcher, warns that the system could be used to frame innocent people by sending them seemingly harmless images engineered to trigger matches against the list of known images. "Researchers have been able to do this fairly easily," he said.

Tech companies including Microsoft, Google and Facebook have for years shared "hashed lists" of known child sexual abuse images. Apple has also scanned user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.
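These shared "hashed lists" generally rely on perceptual hashes (PhotoDNA being the best-known example), which, unlike cryptographic hashes, stay similar when an image is resized or recompressed, so matching is a nearness test rather than an exact comparison. The sketch below is a generic illustration of that idea, not PhotoDNA or Apple's algorithm: hashes are treated as 64-bit integers, the threshold and sample values are assumptions, and two images count as a match when the Hamming distance between their hashes is small.

```python
# Generic illustration of perceptual-hash matching against a shared hash list.
# The 64-bit hashes, the threshold, and the sample values are assumptions for
# demonstration; real systems (PhotoDNA, NeuralHash) use their own
# representations and thresholds.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def matches_known_list(candidate: int, known_hashes: list[int], threshold: int = 5) -> bool:
    """True if the candidate hash is 'close enough' to any known hash."""
    return any(hamming_distance(candidate, known) <= threshold for known in known_hashes)

# Example: a slightly altered copy of a known image typically flips only a few
# bits of its perceptual hash, so it still matches; an unrelated image does not.
known = [0x8F3A_52C1_90DE_77B4]
print(matches_known_list(0x8F3A_52C1_90DE_77B1, known))  # True  (2 bits differ)
print(matches_known_list(0x1234_5678_9ABC_DEF0, known))  # False (many bits differ)
```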

Some say this technology could leave companies vulnerable to political pressure in authoritarian countries such as China. "What happens when the Chinese government says, 'This is a list of files that we want you to scan'?" Green said. "Does Apple say no? I hope they say no, but their technology won't say no."

The company has been under pressure from governments and law enforcement to allow surveillance of encrypted data. In introducing these new safety measures, Apple must strike a delicate balance between cracking down on the exploitation of children and maintaining its high-profile commitment to protecting the privacy of its users.

Apple says it is confident the technology will succeed, having developed it in consultation with several prominent cryptographers, including Stanford University professor Dan Boneh, whose work in the field has won the Turing Award, often called the technology industry's version of the Nobel Prize.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child sexual abuse images online, acknowledges the potential for abuse of Apple's system but says it is far outweighed by the imperative of fighting child sexual abuse.

"Is it possible? Of course. But is that something I'm worried about? No,” said Hany Farid, a researcher at the University of California at Berkeley, who argues that many other programs designed to secure devices against various threats have not seen “missions of this kind creep up.” For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but uses the system to detect malware and warn users not to click on malicious links.

Apple was one of the first major companies to embrace "end-to-end" encryption, in which messages are scrambled so that only the sender and recipient can read them. Law enforcement, however, has long pressed for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

"Apple's extended protection for children is a game changer," said John Clark, president and CEO of the National Center for Missing and Exploited Children, in a statement. "With so many people using Apple products, these new safety measures have the potential to save lives for children who are seduced online and whose horrific images are circulated in child sexual abuse material."

Julia Cordua, CEO of Thorn, said that Apple's technology balances "the need for privacy with digital security for children." Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with technology platforms.

