Friday, April 19, 2024

Apple Remains Silent About Plans to Detect Known CSAM Stored in iCloud Photos


It has now been over a year since Apple announced plans for three new child safety features: a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and expanded child safety resources for Siri and Search. The latter two features are now available, but Apple remains silent about its plans for the CSAM detection feature.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on “feedback from customers, advocacy groups, researchers, and others.”

In September 2021, Apple posted the following update to its Child Safety page:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson informed The Verge that Apple’s plans for the feature had not changed. To the best of our knowledge, however, Apple has not publicly commented on the plans since that time.

We’ve reached out to Apple to ask whether the feature is still planned, but the company did not immediately respond to our request for comment.

Apple did move forward with implementing its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was “designed with user privacy in mind.” The system would perform “on-device matching using a database of known CSAM image hashes” from child safety organizations, which Apple would transform into an “unreadable set of hashes that is securely stored on users’ devices.”
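
For readers curious what “on-device matching” against a stored hash set could look like in practice, here is a minimal, hypothetical Swift sketch. It is not Apple’s implementation: Apple’s published design relied on a perceptual hash (NeuralHash) combined with cryptographic techniques such as private set intersection so that individual match results stay hidden, whereas this illustration uses a plain SHA-256 lookup and invented names (KnownHashDatabase, matches) purely to show the general shape of checking an image against a locally stored set of hashes.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only. Apple's published design used a perceptual hash
// (NeuralHash) and private set intersection so that neither the device nor
// Apple learns the result of an individual match; this simplified version
// uses an ordinary SHA-256 lookup just to show the shape of on-device
// matching against a locally stored hash set.

struct KnownHashDatabase {
    // Stand-in for the "unreadable set of hashes" shipped to the device.
    private let knownHashes: Set<Data>

    init(knownHashes: Set<Data>) {
        self.knownHashes = knownHashes
    }

    /// Returns true if the image's digest appears in the known hash set.
    func matches(imageData: Data) -> Bool {
        // A cryptographic hash is only a placeholder here; a real perceptual
        // hash is designed to survive resizing and re-encoding, which SHA-256
        // is not.
        let digest = Data(SHA256.hash(data: imageData))
        return knownHashes.contains(digest)
    }
}
```

The property Apple emphasized, and that this sketch does not attempt to model, is that the hash set stored on the device is “unreadable,” so users cannot extract the underlying database.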

Apple planned to report iCloud accounts with known CSAM images to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said a match “threshold” would ensure “less than a one in one trillion chance per year” of an account being incorrectly flagged by the system, along with manual human review of flagged accounts.
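
As a rough illustration of how a threshold plus manual review might gate reporting, here is a short hypothetical Swift sketch. The type names, the recordMatch helper, and the threshold value of 30 are invented for illustration and are not drawn from Apple’s specification.

```swift
// Hypothetical sketch only; the names and the threshold value are invented.
// An account is surfaced for human review only after the number of matched
// images reaches a preset threshold.

struct AccountMatchState {
    let accountID: String
    private(set) var matchCount: Int

    init(accountID: String) {
        self.accountID = accountID
        self.matchCount = 0
    }

    /// Records one matched image and reports whether the account has now
    /// reached the review threshold.
    mutating func recordMatch(threshold: Int) -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

let reviewThreshold = 30  // illustrative value, not Apple's published figure
var state = AccountMatchState(accountID: "example-account")

for _ in 1...reviewThreshold {
    if state.recordMatch(threshold: reviewThreshold) {
        // In the design Apple described, crossing the threshold would trigger
        // manual human review before any report to NCMEC.
        print("Account \(state.accountID) flagged for manual review")
    }
}
```

The intuition behind a threshold is that an account is only flagged after many independent matches, which drives the per-account chance of a false flag far below the chance of any single image matching in error.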

Apple’s plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that Apple’s child safety features could create a “backdoor” into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person’s iCloud account to get their account flagged.
