Apple’s NeuralHash algorithm, which it will use to detect child sexual abuse material (CSAM), has been extracted from a device and rebuilt in Python.
NeuralHash
Redditor u/AsuharietYgvar posted about their accomplishment on Tuesday. They say the algorithm has been present on devices as early as iOS 14.3. To be clear, this is not the full CSAM detection system, only the algorithm used to generate the image hashes. The code is available on GitHub.
Here are the steps taken by the algorithm (sketched in Python after the list):
- Convert image to RGB.
- Resize image to 360×360.
- Normalize RGB values to [-1, 1] range.
- Perform inference with the NeuralHash model, producing a vector of 128 floats.
- Calculate the dot product of a 96×128 matrix with that 128-float vector.
- Apply a binary step to the resulting 96-float vector.
- Convert the vector of 1.0s and 0.0s to bits, resulting in 96-bit binary data.
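The pipeline above can be sketched in a few lines of Python. This is an illustrative sketch, not the extracted code itself: it assumes the model has been exported to ONNX as `model.onnx` and that the 96×128 projection matrix is available as a NumPy file `neuralhash_seed.npy`; those file names and shapes are placeholders, not Apple's actual artifacts.

```python
# Illustrative sketch of the hashing steps described above.
# Assumes: model.onnx (exported NeuralHash model, 128-float output)
#          neuralhash_seed.npy (96x128 projection matrix) -- both hypothetical file names.
import numpy as np
import onnxruntime
from PIL import Image

def neuralhash(image_path: str, model_path: str, seed_path: str) -> str:
    # 1-2. Convert the image to RGB and resize to 360x360.
    img = Image.open(image_path).convert("RGB").resize((360, 360))

    # 3. Normalize RGB values from [0, 255] to the [-1, 1] range.
    arr = np.asarray(img, dtype=np.float32) / 255.0 * 2.0 - 1.0
    # Reorder to NCHW, the layout the exported model is assumed to expect.
    arr = arr.transpose(2, 0, 1)[np.newaxis, :]

    # 4. Run inference; the model is assumed to output 128 floats.
    session = onnxruntime.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    features = session.run(None, {input_name: arr})[0].reshape(128)

    # 5. Dot product of the 96x128 matrix with the 128-float vector.
    seed = np.load(seed_path)  # shape (96, 128)
    projected = seed.dot(features)

    # 6-7. Binary step: positive values become 1, the rest 0,
    #      giving 96 bits of hash output.
    bits = (projected > 0).astype(np.uint8)
    return "".join(str(b) for b in bits)

print(neuralhash("photo.jpg", "model.onnx", "neuralhash_seed.npy"))
```

Because the hash is derived from the model's features rather than the raw pixels, small changes such as resizing or recompressing an image are intended to produce the same 96-bit output.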
The extracted model runs on both macOS and Linux.