Apple Privacy Chief Tries to Reduce Concerns About CSAM and Messages Safety Features


Apple’s head of privacy, Erik Neuenschwander, tried to explain the company’s stance in light of the recent outcry over its proposals to counter the spread of Child Sexual Abuse Material (CSAM) and to add new safety features to Messages and Siri. Speaking to TechCrunch, he insisted that “we’ve now got the technology that can balance strong child safety and user privacy.”

Scanning for CSAM and Protecting Privacy

In one key exchange regarding scanning for CSAM, Mr. Neuenschwander said:

The system as designed doesn’t reveal — in the way that people might traditionally think of a match — the result of the match to the device or, even if you consider the vouchers that the device creates, to Apple. Apple is unable to process individual vouchers; instead, all the properties of our system mean that it’s only once an account has accumulated a collection of vouchers associated with illegal, known CSAM images that we are able to learn anything about the user’s account.

He also insisted that “we’re going to leave privacy undisturbed for everyone not engaged in the illegal activity,” and reiterated:

The device is still encrypted, we still don’t hold the key, and the system is designed to function on on-device data.

When quizzed on sharing information with governments, he emphasized that the new detection system was only being rolled out in the U.S.

New Features in Messages

On the changes to Messages, Mr. Neuenschwander said:

Communication Safety in Messages takes place entirely on the device and reports nothing externally — it’s just there to flag to a child that they are, or could be about to be, viewing explicit images. This feature is opt-in by the parent and transparent to both parent and child that it is enabled.
