JAKARTA - Apple has reportedly removed references to its Child Sexual Abuse Material (CSAM) scanning feature, which has become controversial in recent months.

As first reported by MacRumors on Thursday, December 16, Apple has removed all mention of the scanning feature from its Child Safety website. It is not clear what prompted the change.

Even so, Apple may not have abandoned CSAM scanning entirely, and the company could still continue developing it. Removing the feature from its site, however, suggests that a rollout is unlikely to happen anytime soon.

In August, Apple announced a set of planned child safety features, including scanning users' iCloud Photos libraries for CSAM, Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblowers, the Electronic Frontier Foundation (EFF), Facebook's former head of security, politicians, policy groups, university researchers, and even some Apple employees.

The majority of the criticism was directed at Apple's planned on-device CSAM detection, which researchers argued relied on dangerous technology bordering on surveillance and questioned as ineffective at identifying images of child sexual abuse.

Apple initially tried to clear up misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and giving interviews with company executives. Despite these efforts, however, the controversy did not go away.
