Friday, March 29, 2024

European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material on Digital Platforms


The European Commission is set to release a draft law this week that could require tech companies like Apple and Google to identify and remove illegal images of child abuse on their platforms and report them to law enforcement, claims a new report out today.

According to a leak of the proposal obtained by Politico, the EC believes voluntary measures taken by some digital companies have thus far “proven insufficient” in addressing the increasing misuse of online services for the purposes of sharing child sexual abuse content, which is why the commission wants to make detection of such material mandatory.

After months of lobbying, groups representing tech companies and children’s rights organizations are said to be waiting to see how stringent the rules will be, and how they can work without requiring tech companies to scan the full range of user content – a practice deemed illegal by the Court of Justice of the European Union in 2016.

Apart from how identification of illegal material would operate within the law, privacy groups and tech companies are worried that the proposal could result in the creation of backdoors to end-to-end encrypted messaging services, whose contents cannot be accessed by the hosting platform.
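
To illustrate the property at stake: in an end-to-end encrypted service, messages are encrypted and decrypted only on users’ devices, so the platform merely relays ciphertext it cannot read. The CryptoKit sketch below is a deliberately simplified illustration of that idea, not any messenger’s actual protocol; real apps derive per-conversation keys via key agreement (e.g., the Double Ratchet) rather than sharing a single static key.

```swift
import Foundation
import CryptoKit

// Simplified: sender and recipient share a symmetric key. Real messengers
// negotiate per-conversation keys with protocols like the Double Ratchet.
let sharedKey = SymmetricKey(size: .bits256)

// The sender encrypts on-device; the server only ever sees these bytes.
let message = Data("see you at noon".utf8)
let sealed = try! AES.GCM.seal(message, using: sharedKey)
let ciphertextForServer = sealed.combined!  // opaque to the platform

// The recipient decrypts on-device with the shared key.
let box = try! AES.GCM.SealedBox(combined: ciphertextForServer)
let plaintext = try! AES.GCM.open(box, using: sharedKey)
print(String(data: plaintext, encoding: .utf8)!)  // "see you at noon"
```

Any mandatory detection of message content would have to break this model somewhere – either by scanning on the device before encryption, or by giving the platform a way to read the ciphertext, which is what critics mean by a backdoor.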

The EC’s Home Affairs Commissioner Ylva Johansson has said technical solutions exist to keep conversations safe while finding illegal content, but cybersecurity experts disagree.

“The EU shouldn’t be proposing things that are technologically impossible,” Ella Jakubowska, a policy adviser at European Digital Rights (EDRi), a network of 45 non-governmental organizations (NGOs), told Politico.

“The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented,” said Jakubowska.

MEPs are far from aligned on the issue, however. Reacting to the leak of the proposal, centrist Renew Europe MEP Moritz Körner told Politico the Commission’s proposal would mean “the privacy of digital correspondence would be dead.”

The heated debate mirrors last year’s controversy surrounding Apple’s plan to search for CSAM (child sexual abuse material) on iPhones and iPads.

Apple in August 2021 announced a planned suite of new child safety features, including scanning users’ iCloud Photos libraries for CSAM and a Communication Safety feature to warn children and their parents when receiving or sending sexually explicit photos. The latter, and arguably less controversial, feature is already live on Apple’s iMessage platform. Apple’s method of scanning for CSAM has yet to be deployed.
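
Apple’s published design is considerably more involved than a simple lookup: it uses NeuralHash, a perceptual hash that tolerates resizing and re-encoding, matched against a blinded database of known CSAM hashes via private set intersection, with a threshold of matches required before any human review. Purely as a rough sketch of the underlying hash-lookup idea – with a plain SHA-256 digest standing in for a perceptual hash, and a hypothetical in-memory set standing in for the blinded database – the flow looks something like this:

```swift
import Foundation
import CryptoKit

// Hypothetical fingerprint database. In a real deployment this would be a
// blinded set of hashes of known CSAM supplied by child-safety organizations.
let knownFingerprints: Set<String> = ["9f2c…", "41ab…"]  // placeholder values

/// Stand-in fingerprint: a plain SHA-256 digest. A production system would
/// use a perceptual hash (e.g., Apple's NeuralHash) so visually identical
/// images still match after resizing or re-encoding.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Flags an image only if its fingerprint appears in the known-hash set.
/// Note that novel images never match: hash matching can only recognize
/// previously catalogued material, not identify new abuse imagery.
func shouldFlag(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```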

Following Apple’s announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple’s planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to dispel misunderstandings and allay concerns by releasing detailed information and sharing interviews with company executives. Despite these efforts, however, the controversy didn’t go away, and Apple ultimately decided to delay the rollout of CSAM detection following the torrent of criticism.

Apple said its decision to delay was “based on feedback from customers, advocacy groups, researchers and others… we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

In December 2021, Apple quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads hung in the balance following significant criticism of its methods.

However, Apple says its plans for CSAM detection have not changed since September, which suggests CSAM detection in some form is still coming in the future.