Much controversy has followed Apple’s announcement that it will begin scanning photographs uploaded to iCloud Photos for child sexual abuse material (CSAM). What wasn’t known, until now, is that Apple already has similar technology in use on another service. Since at least 2019, Apple has been scanning iCloud Mail for CSAM attachments.
Questioning Apple’s Claim That It Is the Greatest Platform for Distributing Child Porn
Ben Lovejoy of 9to5Mac recently reported on a statement by Eric Friedman, Apple’s anti-fraud chief, in which the executive described Apple as “the greatest platform for distributing child porn.” That made Lovejoy wonder: how could Apple know this if it wasn’t already scanning our photos?
Another clue came from an archived version of Apple’s child safety page. In the older version of that page, Apple stated that it “uses image matching technology to help find and report child exploitation.” According to Cupertino, this technology works much like a spam filter: the servers use electronic signatures to identify suspected CSAM.
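For illustration only, here is a minimal sketch of the signature-matching idea that page describes, using a plain SHA-256 digest and a hypothetical KNOWN_SIGNATURES set. Apple’s production system is reported to rely on perceptual image hashing rather than exact cryptographic hashes, so treat this as a conceptual analogy, not the actual implementation.

```python
import hashlib

# Hypothetical set of known signatures. In reality these come from
# child-safety organizations and are not plain SHA-256 digests.
KNOWN_SIGNATURES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_signature(image_bytes: bytes) -> str:
    """Compute a simple exact-match signature for an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_suspected_match(image_bytes: bytes) -> bool:
    """Flag an image whose signature matches a known entry."""
    return image_signature(image_bytes) in KNOWN_SIGNATURES
```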
In January 2020, Jane Horvath, Apple’s chief privacy officer, confirmed at a tech conference that the company uses screening technology to look for CSAM. As reported at the time, “the company uses screening technology to look for the illegal images. The company says it disables accounts if Apple finds evidence of child exploitation material, although it does not specify how it discovers it.”
Apple Is Already Scanning iCloud Mail for CSAM
All of this led Lovejoy to question how Cupertino would know it was such a prolific platform for spreading child porn. While Apple would not comment on Friedman’s quote, the company did tell Lovejoy that it has never (yet) scanned iCloud Photos. However, Cupertino confirmed that it has been scanning both incoming and outgoing iCloud Mail for these attachments since 2019. Because iCloud Mail is not end-to-end encrypted, scanning attachments as they pass through Apple’s mail servers is relatively straightforward.
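Apple has not published details of its mail-scanning pipeline, but the general shape of server-side attachment checking on an unencrypted mail flow might look something like the hedged sketch below, using Python’s standard email parser and the same hypothetical signature set as above.

```python
import email
import hashlib
from email import policy

# Hypothetical blocklist of known attachment digests; a real system would
# query a curated hash database rather than a hard-coded set.
KNOWN_SIGNATURES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def scan_message(raw_message: bytes) -> list[str]:
    """Return filenames of attachments whose digests match the blocklist."""
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    flagged = []
    for part in msg.iter_attachments():
        payload = part.get_payload(decode=True)
        if payload is None:
            continue
        digest = hashlib.sha256(payload).hexdigest()
        if digest in KNOWN_SIGNATURES:
            flagged.append(part.get_filename() or "<unnamed attachment>")
    return flagged
```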
Our own Andrew Orr pointed this out earlier in August 2021, but it bears repeating: Apple isn’t the bastion of personal privacy it once was.
When you are distributing illegal material, you forfeit your rights to privacy, IMO. Perpetrators and distributors of such material need to think about the violation of their victims’ rights.
Nice take, Jeff. It begs the question…
https://www.macobserver.com/columns-opinions/devils-advocate/csam-question-no-one-is-asking-apple/