Apple will be scanning everyone’s images for “your safety.”
On Thursday, Apple announced a series of changes that it says are designed to better protect children. In a sense, the changes represent a noble effort on Apple's part to address a very real problem: the sexual exploitation of minors. That's a fight we can all get behind.
At the same time, the changes represent the most significant shift yet in the promise Apple makes to its users about how it treats their data. The biggest change is that when you upload images to iCloud Photos, Apple will now analyze them to determine whether they match known child sexual abuse material (CSAM).
According to Apple's documentation, "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes." Basically, your iPhone will analyze images as you upload them to iCloud, using a technology that converts each image into a mathematical hash and compares that hash against a database of hashes of known exploitative content. No one actually sees the image, and the content remains private, but its hash can be compared with those in the database.
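To make the idea concrete, here is a heavily simplified sketch of hash-based matching. Everything in it is illustrative: the database values are placeholders, and Apple's actual system uses a perceptual hash called NeuralHash, which matches visually similar images, not the cryptographic SHA-256 used here for simplicity.

```python
import hashlib

# Hypothetical database of known-bad hashes (placeholder values only).
known_csam_hashes = {"0" * 64}

def hash_image(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint.

    The fingerprint reveals nothing about the image's content; it can
    only be compared for equality against other fingerprints.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """True only if the image's hash appears in the known-hash database."""
    return hash_image(image_bytes) in known_csam_hashes
```

The key property is that the comparison happens on hashes, not on the images themselves, which is why the check can run without anyone viewing your photos.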
If there's a match, the image is flagged and reviewed manually. If the number of flagged images in an iCloud account reaches a certain threshold, the account is reported to the National Center for Missing and Exploited Children. For security reasons, Apple doesn't say what that threshold is.
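The thresholding step can be sketched as a simple per-account counter. The threshold value below is a made-up placeholder, since Apple does not disclose the real number:

```python
from collections import defaultdict

# Hypothetical reporting threshold; the real value is undisclosed.
THRESHOLD = 10

match_counts: dict = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one flagged image for an account.

    Returns True only once the account's running total reaches the
    threshold, so a single stray match cannot trigger a report on
    its own.
    """
    match_counts[account_id] += 1
    return match_counts[account_id] >= THRESHOLD
```

Requiring many matches before any report is made is what limits the impact of an individual false positive.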