Apple Posts FAQ About its CSAM Scanning in iCloud Photos


Apple has recently shared an FAQ [PDF] about its move to detect examples of child sexual abuse material (CSAM) in users’ iCloud Photos. It attempts to answer many of the questions people may have about the change.

iCloud Photos Fingerprinting

I want to share a couple of the questions I feel are most important. Apple announced scanning in Messages as well as in iCloud Photos, but the two features are entirely different from one another. Message scanning only happens under specific circumstances.

Q: What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?

A: Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages.

When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

[…]

Communication safety in Messages is only available for accounts set up as families in iCloud. Parent/guardian accounts must opt in to turn on the feature for their family group. Parental notifications can only be enabled by parents/guardians for child accounts age 12 or younger.
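
In plain terms, the rules quoted above boil down to two gates: the parent or guardian has to opt the family group in, and parental notifications are a further option that only exists for children 12 or younger. Here is a minimal Swift sketch of that logic; the type and property names are my own and purely illustrative, since Apple hasn’t published an API for any of this.

```swift
// Hypothetical model of a child account; all names here are illustrative only.
struct ChildAccount {
    let age: Int
    let inFamilySharingGroup: Bool
    let parentEnabledCommunicationSafety: Bool
}

// Communication safety only works for child accounts in a Family Sharing
// group whose parent/guardian has turned the feature on.
func communicationSafetyActive(for account: ChildAccount) -> Bool {
    account.inFamilySharingGroup && account.parentEnabledCommunicationSafety
}

// Parental notifications are an additional option the parent can enable,
// and only for children aged 12 or younger.
func canEnableParentalNotifications(for account: ChildAccount) -> Bool {
    communicationSafetyActive(for: account) && account.age <= 12
}
```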

Q: Does this mean Apple is going to scan all the photos stored on my iPhone?

A: No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.

I highly recommend reading the entire thing. For example, the phrase “known CSAM” is important. The system can only detect hashes of images that match hashes of known CSAM in NCMEC’s database. Your innocent photos of kids naked in a bathtub or elsewhere will not get flagged, because they haven’t been part of a law enforcement investigation. And if they have, you may already be in prison anyway.
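
To make the “known CSAM” point concrete, here is a rough Swift sketch of what matching against a database of known fingerprints looks like. It is heavily simplified: Apple’s actual system uses a perceptual hash (NeuralHash) plus cryptographic techniques such as private set intersection and threshold secret sharing, so the device never even learns whether an image matched. I’m using SHA-256 purely as a stand-in fingerprint, and every name below is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical set of known fingerprints. In the real system the NCMEC
// database ships in blinded form inside the OS and is never readable like this.
let knownFingerprints: Set<String> = [
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
]

// Stand-in fingerprint. Apple uses a perceptual hash (NeuralHash), not SHA-256;
// SHA-256 appears here only so the sketch runs as written.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Matching is gated on the photo actually being uploaded to iCloud Photos;
// photos that stay only in the on-device library never reach this step.
func matchesKnownCSAM(_ imageData: Data, uploadingToICloudPhotos: Bool) -> Bool {
    guard uploadingToICloudPhotos else { return false }
    return knownFingerprints.contains(fingerprint(of: imageData))
}

// Example: the same photo, with and without iCloud Photos enabled.
let photo = Data([0x01, 0x02, 0x03])
print(matchesKnownCSAM(photo, uploadingToICloudPhotos: false)) // false — never checked
print(matchesKnownCSAM(photo, uploadingToICloudPhotos: true))  // false unless its fingerprint is in the list
```

The point the sketch illustrates is the gate: a photo that never goes to iCloud Photos never reaches the matching step, and matching only ever compares fingerprints against the fixed, NCMEC-supplied list, not against arbitrary content.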

The main worry Apple customers have is the “slippery slope.” Could this type of scanning be extended to search for other types of content? Apple says it will refuse such demands from the government, but as we’ve seen in China, Apple is willing to follow local laws.
