Thursday, April 25, 2024

Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off


Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States.

User devices will download an unreadable database of known CSAM image hashes and perform an on-device comparison against the user’s own photos, flagging any that match known CSAM before they are uploaded to iCloud Photos. Apple says that this is a highly accurate method for detecting CSAM and protecting children.
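The Swift sketch below illustrates that on-device matching step in highly simplified form. The names (PerceptualHash, OnDeviceMatcher, matchesKnownCSAM) are ours for illustration only; Apple’s actual system uses its NeuralHash algorithm together with private set intersection, so the device never sees the contents of the hash database and cannot learn whether a given photo matched.

```swift
import Foundation

// Simplified sketch of the on-device matching step described above.
// Hypothetical names throughout; the real system is cryptographically blinded.

typealias PerceptualHash = Data

struct OnDeviceMatcher {
    // The downloaded database of known CSAM image hashes, modeled here as a
    // plain set. On a real device this database is unreadable to the user.
    let knownHashes: Set<PerceptualHash>

    // Stand-in for a perceptual hashing function such as NeuralHash, where
    // visually similar images map to the same hash value.
    func perceptualHash(of imageData: Data) -> PerceptualHash {
        // Placeholder only: a real perceptual hash is derived from image
        // content, not from the leading bytes of the file.
        return Data(imageData.prefix(32))
    }

    // Compare a photo's hash against the known-hash database before upload.
    func matchesKnownCSAM(_ imageData: Data) -> Bool {
        return knownHashes.contains(perceptualHash(of: imageData))
    }
}
```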

CSAM image scanning is not optional and happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the ‌iCloud Photos‌ feature is turned off.

Apple’s method works by identifying a known CSAM photo on device and then flagging it when it is uploaded to ‌iCloud Photos‌ with an attached voucher. Only after a certain number of vouchers (aka flagged photos) have been uploaded to ‌iCloud Photos‌ can Apple interpret the vouchers and conduct a manual review. If CSAM content is confirmed, the user account is disabled and the National Center for Missing and Exploited Children is notified.
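As a rough illustration of that threshold flow, here is a simplified Swift sketch. The names and the threshold value are illustrative, not Apple’s published figures; in the actual design the vouchers are protected with threshold secret sharing, so their contents remain unreadable to Apple until an account crosses the match threshold.

```swift
import Foundation

// Simplified sketch of the voucher-threshold flow described above.
// SafetyVoucher, VoucherEvaluator, and matchThreshold are hypothetical names.

struct SafetyVoucher {
    let accountID: String
    let encryptedPayload: Data   // only interpretable once the threshold is met
}

final class VoucherEvaluator {
    let matchThreshold = 30      // illustrative value, not Apple's published number
    private var vouchersByAccount: [String: [SafetyVoucher]] = [:]

    // Called as flagged photos (with attached vouchers) arrive in iCloud Photos.
    func receive(_ voucher: SafetyVoucher) {
        vouchersByAccount[voucher.accountID, default: []].append(voucher)
    }

    // Only accounts past the threshold can have their vouchers interpreted
    // and passed along for manual review.
    func accountsReadyForManualReview() -> [String] {
        return vouchersByAccount
            .filter { $0.value.count >= matchThreshold }
            .map { $0.key }
    }
}
```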

Because Apple is scanning ‌iCloud Photos‌ for the CSAM flags, it makes sense that the feature does not work with ‌iCloud Photos‌ disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if ‌iCloud Photos‌ is disabled on a user’s device.

It’s worth noting that Apple is scanning specifically for hashes of known child sexual abuse materials and it is not broadly inspecting a user’s photo library or scanning personal images that are not already circulating among those who abuse children. Still, users who have privacy concerns about Apple’s efforts to scan user photo libraries can disable ‌iCloud Photos‌.

Security researchers have expressed concerns over Apple’s CSAM initiative and worry that it could in the future be expanded to detect other kinds of content with political and safety implications, but for now, Apple’s efforts are limited to seeking out known child sexual abuse material.