EU Proposes New Regulation Requiring Apple and Other Tech Giants to Detect Child Sexual Abuse Material

The European Commission has proposed new legislation to prevent and combat child sexual abuse online. Once approved, the new EU regulation will require Apple and other tech giants to detect, report, and remove child sexual abuse material (CSAM) on their services.

Voluntary Detection Not Enough to Stop Child Sexual Abuse Online

According to the European Commission, the proposed legislation aims to prevent and combat child sexual abuse online. The Commission noted that 85 million pictures and videos depicting child sexual abuse were reported in 2021 alone, and that the COVID-19 pandemic has made the problem worse over the past two years, with the Internet Watch Foundation reporting a 64% increase in confirmed child sexual abuse material.

The Commission said that the current system, in which companies voluntarily detect and report abuse, is no longer adequate to protect children. To address the issue effectively, it therefore proposed the new regulation, whose rules will require companies to detect, report, and remove CSAM from their services.

Apple’s CSAM Troubles Return

As mentioned, the proposed EU regulation will affect tech giants, including Apple. In August 2021, Apple announced new child safety features, among them scanning iCloud Photos libraries for known CSAM. As part of its Communication Safety feature, Apple also deployed automatic notifications to children and parents when sensitive photos are received or sent via iMessage.
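
For context, Apple’s published design did not classify photo content. Instead, it compared a perceptual hash of each image against a database of hashes of already-known abuse imagery, and flagged an account only after a threshold number of matches. The sketch below illustrates that general idea in Python; the simple average hash stands in for Apple’s NeuralHash, and the hash set, distance tolerance, and reporting threshold are hypothetical placeholders, not Apple’s actual parameters.

```python
# Illustrative sketch of hash-matching detection. A toy "average hash"
# stands in for Apple's NeuralHash; KNOWN_HASHES, MAX_DISTANCE, and
# MATCH_THRESHOLD are hypothetical placeholders, not Apple's values.

from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Tiny perceptual hash: downscale, grayscale, then set one bit per
    pixel brighter than the mean. Resized or re-encoded copies of an
    image produce the same or nearby hashes."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hashes of *known* abuse imagery, in practice supplied by child-safety
# organizations. Detection only matches against this list; it never
# interprets novel image content. (Empty placeholder here.)
KNOWN_HASHES: set = set()

MAX_DISTANCE = 4      # hypothetical per-image match tolerance, in bits
MATCH_THRESHOLD = 30  # hypothetical: flag only after many matches


def count_matches(photo_paths: list) -> int:
    """Count photos whose hash is close to any known hash."""
    return sum(
        1
        for path in photo_paths
        if any(hamming(average_hash(path), h) <= MAX_DISTANCE
               for h in KNOWN_HASHES)
    )


def should_report(photo_paths: list) -> bool:
    """Report only past the threshold, limiting false-positive impact."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The privacy argument for this design is that the provider only learns about matches against already-known material; critics countered that the same matching infrastructure, once built, could be pointed at any database of content.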

However, Apple’s plan to scan iCloud Photos never fully materialized. Widespread criticism that the feature would compromise users’ privacy led Apple to postpone the rollout, and in December 2021 the company stopped mentioning CSAM detection on its Child Safety webpage.

Proposed EU Regulation and Apple’s CSAM Woes

One of the rules in the proposed legislation mandates that companies use the least privacy-intrusive technology available for detecting CSAM. However, the rule does not specify how to achieve this. To what extent can companies perform detection without it amounting to personal surveillance and thus violating user privacy? Apple’s experience suggests that deploying detection technology will be a hard sell. It remains to be seen whether the proposed EU legislation can overcome the privacy objections and stop the proliferation of CSAM.

One thought on “EU Proposes New Regulation Requiring Apple and Other Tech Giants to Detect Child Sexual Abuse Material”

  • Arnold:

    ‘One of the rules in the proposed legislation mandates that companies use the least privacy-intrusive technology available for detecting CSAM. However, the rule does not specify how to achieve this.’

    Of course it didn’t. Perish the thought that any of the mandates from the EU should actually provide guidance on how to achieve simultaneous compliance and preserving extant privacy protocols. Brussels (or for that matter, legislatures the world over) are simply ‘idea guys’; you know, folks who come up with the big ideas. They leave the small stuff, like how to make these big ideas work, such as harmonising messaging platforms, to the little people.

    How hard could it be? After all, were these things truly difficult, and worthy of higher thought and expertise, why, these little people would have ascended to membership in parliaments and congresses, like the ‘big ideas’ people, would they not?

    It’ll all be fine. You’ll see. 
