Redditor u/AsuharietYgvar spotted a change on Apple’s page that lists child safety features. The company has removed any mention of its controversial CSAM detection plans in iCloud Photos.
Update: In response to The Verge, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September. Apple plans to move forward with the detection feature and eventually release it.
iOS 15 Child Safety
Apple announced the move in August as a feature coming in a version of iOS 15. It would detect images of child sexual abuse material (CSAM) as they were uploaded to iCloud Photos. Detection required certain conditions to be met, such as the image's hash matching an entry in a database maintained by the National Center for Missing and Exploited Children (NCMEC).
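Conceptually, the pipeline Apple described amounts to checking an on-device image hash against a list of known CSAM hashes, with a match threshold that must be crossed before any account is flagged. The Swift sketch below is a loose illustration of that idea, not Apple's implementation: the real system uses a perceptual NeuralHash and a private set intersection protocol rather than a plain SHA-256 and a local set lookup, and every name here (KnownHashDatabase, UploadScanner, matchThreshold) is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the blinded NCMEC hash database Apple described.
struct KnownHashDatabase {
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    func contains(_ hash: String) -> Bool {
        knownHashes.contains(hash)
    }
}

struct UploadScanner {
    let database: KnownHashDatabase
    // Apple said an account would only be flagged after a threshold of matches;
    // the value here is a placeholder, not the real parameter.
    let matchThreshold = 30
    private(set) var matchCount = 0

    // Hash the image bytes. A real perceptual hash tolerates resizing and
    // recompression; SHA-256 is used here only to keep the sketch self-contained.
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Returns true once the running match count crosses the threshold,
    // the point at which Apple's design would escalate to human review.
    mutating func scanBeforeUpload(_ imageData: Data) -> Bool {
        if database.contains(hash(of: imageData)) {
            matchCount += 1
        }
        return matchCount >= matchThreshold
    }
}
```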
The web page had previously said:
Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
Communication Safety in Messages did still launch with iOS 15.2. The Messages app includes tools that warn children when they receive or send photos containing nudity. These features are not enabled by default; if parents opt in, the warnings are turned on for the child accounts in their Family Sharing plan.
Redditor u/AsuharietYgvar has also claimed to have extracted the NeuralHash algorithm used for CSAM detection, and says the code is still present as of iOS 15.2. It remains unclear whether Apple will abandon the feature entirely or release it in a future version of its operating systems.
Apple did the right thing on something no one is properly reporting. For the nude-photo feature on iPhone, it gave parents a checkbox to opt in (or not) to looking after their kids. It puts that power in parents' hands. Well done on this part:
https://support.apple.com/en-us/HT212850
Now let's hope they kill their Orwellian nightmare of a CSAM fiasco.
Andrew:
The global legislative fixation on encryption backdoors as an aid to law enforcement is not going away anytime soon, and child-related sex trafficking, predation, and pornography will continue to serve as a cudgel against Big Tech to provide that backdoor at pain of penalty.
Whatever Apple's response to this threat to user privacy, a principal feature of their platform, that response is more likely to succeed if it is proactive, with Apple defining the terms of the solution, rather than reactive to terms set by a fragmented and mutually contradictory body of world legislators – a motley assortment unschooled in, and refractory to, even the rudiments of modern security protocols.
An equal if not larger threat to Apple, however, is whether the company has gleaned the right lessons from their botched attempt at rolling out their CSAM solution, and what specific steps they need to take with their next move, whatever and whenever that may be, to preserve, to the greatest extent possible, the trust of the greatest possible fraction of their user base.