EU Approves Draft Rules To Identify And Remove Child Sexual Abuse Material On Online Services
JAKARTA - EU lawmakers on Wednesday, 22 November, agreed on draft rules requiring Alphabet's Google, Meta, and other online services to identify and remove child sexual abuse material online, insisting that end-to-end encryption will not be affected.
The draft rules on child sexual abuse material (CSAM), proposed by the European Commission last year, have been a point of contention between supporters of online safety measures and privacy activists worried about surveillance.
The EU executive drew up the CSAM proposal after the current system of voluntary detection and reporting by companies proved insufficient to protect children.
EU lawmakers must still finalize the details with member states before the draft can become law, a process that may be completed next year.
The proposed law would require messaging services, app stores, and internet access providers to report and remove known and new images and videos, as well as cases of grooming.
An EU Centre on Child Sexual Abuse would be set up to act as a hub of expertise and to forward reports to the police.
To avoid mass surveillance, EU lawmakers tightened the rules on detection orders, allowing judicial authorities to issue time-limited orders to locate and remove CSAM. Such orders can only be issued where there are reasonable grounds to suspect child sexual abuse.
Companies would also be able to choose the technology used to detect such abuse, provided it is subject to independent, public auditing.
Lawmakers' decision to exclude end-to-end encryption from the draft regulation has drawn praise from privacy activists.
"The position of the European Parliament removes discriminatory chat control and only allows targeted surveillance of certain individuals and groups that are plausiblely suspected of being linked to child sexual abuse material with a judicial order," said The European Liberal Youth (LYMEC).