Apple Removes Mention of CSAM Detection on Child Safety Page, Code Remains in iOS


Redditor u/AsuharietYgvar spotted a change to Apple’s Child Safety page: the company has removed any mention of its controversial plans for CSAM detection in iCloud Photos.

Update: In a statement to The Verge, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September: Apple still plans to move forward with the detection feature and eventually release it.


Apple announced the feature in August as part of a future version of iOS 15. It was designed to detect images of child sexual abuse material (CSAM) as they were uploaded to iCloud Photos, flagging an image only if its hash matched an entry in a database of known CSAM hashes supplied by the National Center for Missing & Exploited Children (NCMEC).
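To make the mechanism concrete, the general idea is to compare a perceptual hash of each photo against a set of known hashes before upload. The Python sketch below illustrates only that general idea; the hash values, the zero-distance threshold, and the plain set lookup are illustrative placeholders, not Apple’s NeuralHash model or the private set intersection protocol Apple described.

```python
# Illustrative sketch of hash-based matching. The 96-bit width mirrors
# NeuralHash digests, but the hash values and matching logic here are
# placeholders, not Apple's implementation.
from dataclasses import dataclass


@dataclass(frozen=True)
class MatchResult:
    matched: bool
    distance: int  # smallest Hamming distance to any known hash


# Placeholder database of known 96-bit hashes; the values are made up.
KNOWN_HASHES = {
    0x1F3A5C7E9B2D4F6081A3C5E7,
}


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two equal-length hashes."""
    return bin(a ^ b).count("1")


def check_hash(image_hash: int, threshold: int = 0) -> MatchResult:
    """Compare one image hash against every known hash.

    A threshold of 0 requires an exact match; a small positive threshold
    would tolerate minor perceptual differences between images.
    """
    best = min(hamming_distance(image_hash, known) for known in KNOWN_HASHES)
    return MatchResult(matched=best <= threshold, distance=best)


if __name__ == "__main__":
    print(check_hash(0x1F3A5C7E9B2D4F6081A3C5E7))  # exact match
    print(check_hash(0x1F3A5C7E9B2D4F6081A3C5E6))  # one bit differs
```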

The web page had previously said:

Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Communication Safety in Messages did still launch with iOS 15.2. The Messages app includes tools that warn children when they receive or send photos containing nudity. These features are not enabled by default; if parents opt in, the warnings are turned on for the child accounts in their Family Sharing plan.

Redditor u/AsuharietYgvar had earlier claimed to have extracted the NeuralHash algorithm used for CSAM detection, and claims that the code is still present as of iOS 15.2. It remains unknown whether Apple plans to abandon the feature entirely or release it in a future version of its operating systems.
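For readers curious what “extracting the NeuralHash algorithm” means in practice, the hedged sketch below shows how one might run an exported, NeuralHash-style ONNX model to produce a 96-bit digest. The model and seed file paths, the 360×360 input size, the [-1, 1] normalization, the 128-dimensional embedding, and the 96×128 projection matrix are all assumptions drawn from public community write-ups, not verified details of Apple’s implementation; no model or seed file is provided or endorsed here.

```python
# Hedged sketch of producing a NeuralHash-style digest from an exported
# ONNX model. Paths, input size, normalization, and matrix shapes are
# assumptions based on community descriptions, not Apple's code.
import numpy as np
import onnxruntime as ort
from PIL import Image

MODEL_PATH = "model.onnx"          # hypothetical exported embedding model
SEED_PATH = "projection_seed.npy"  # hypothetical 96x128 projection matrix


def neuralhash_style_digest(image_path: str) -> str:
    # Preprocess: resize to the assumed 360x360 input and scale to [-1, 1].
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img).astype(np.float32) / 255.0 * 2.0 - 1.0
    arr = arr.transpose(2, 0, 1)[np.newaxis, ...]  # NCHW layout

    # Run the embedding model (assumed to output a 128-dim vector).
    session = ort.InferenceSession(MODEL_PATH)
    input_name = session.get_inputs()[0].name
    embedding = session.run(None, {input_name: arr})[0].reshape(-1)

    # Project onto the seed matrix and binarize the signs into 96 bits.
    seed = np.load(SEED_PATH)  # assumed shape (96, 128)
    bits = (seed @ embedding) >= 0
    value = int("".join("1" if b else "0" for b in bits), 2)
    return f"{value:024x}"  # 96 bits rendered as 24 hex characters


# Usage (requires model and seed files, which are not distributed here):
# print(neuralhash_style_digest("photo.jpg"))
```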

2 thoughts on “Apple Removes Mention of CSAM Detection on Child Safety Page, Code Remains in iOS”

  • Andrew:

    The global legislative fixation on encryption backdoors as an aid to law enforcement is not going away anytime soon, and child-related sex trafficking, predation, and pornography will continue to serve as a cudgel to press Big Tech into providing that backdoor on pain of penalty.

    Whatever Apple’s response to this threat to user privacy, a principal feature of its platform, that response is more likely to preserve user privacy if it is proactive, with Apple defining the terms of the solution, rather than reactive to terms set by a fragmented and mutually contradictory body of world legislators, a motley assortment unschooled in and refractory to even the rudiments of modern security protocols.

    An equal if not larger threat to Apple, however, is whether the company has gleaned the right lessons from its botched rollout of its CSAM solution, and what specific steps its next move, whatever and whenever that may be, must take to preserve the trust of the greatest possible fraction of its user base.
