
Apple scrubs most mentions of controversial CSAM features after iOS 15.2 update


Apple this week released iOS 15.2 with a host of new features, including a communication safety feature for the Messages app focused on child protection. However, the company did not include its controversial iCloud Photos child sexual abuse material (CSAM) scanning feature, and it appears to have scrubbed most mentions of that feature from its website.

By way of recap, iOS 15.2 was originally intended to ship with a CSAM detection feature that would scan a user’s iCloud photo library for known CSAM. Apple noted that the scanning would happen on-device, for maximum privacy. Privacy experts objected, arguing that the very concept of scanning a user’s photo library for prohibited material was a privacy violation in itself, and one that could expand to cover other material once the precedent was set.


“Previously, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a note it previously added to its original child safety page.

As spotted by MacRumors, the company has since deleted that segment from the page, along with any mention of the iCloud Photos scanning feature. We’ve reached out to Apple for clarification on this point.

Apple also made some slight changes to the communication safety feature in Messages. Originally, parents would have been automatically notified if a child younger than 18 chose to view or send a nude image via the app; in response to feedback from experts, the company has pulled that notification requirement.

Instead, users under 18 will simply be warned that an image could contain sensitive material and will see an explainer of what sensitive photos are and how they can be used to cause harm; the warning will not specify the nature of the explicit photo itself.
