Apple to search iPhones for child pornography, initially in the US

That announcement caused a lot of fuss, and lawyers and privacy experts have questioned the plans.
At the same time, it is in a sense not very different from what other tech companies already do: scanning the photos you upload to a cloud service for child pornography. Google, Microsoft and Dropbox, among others, do exactly that.
But there is one big difference: where those companies scan your photos in the cloud, where they are stored, under Apple's plans it is your phone itself that looks for child pornography.
Child pornography list
Many tech companies have been scanning for child porn photos for years. This happens in more or less the same way everywhere, and it starts with a list of confirmed child pornography material.
Those are not the photos themselves: storing them would itself be legally problematic. Instead, the list consists of digital signatures of the material.
Child pornography photos and videos are run through a special mathematical function, producing a signature. The original material cannot be reconstructed from that signature.
If an internet user uploads a photo to their online photo library, or e-mails it to someone else, that photo can easily be compared against the signature list.
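The comparison described above can be sketched in a few lines. Note that real systems use perceptual hashes (such as Microsoft's PhotoDNA or Apple's NeuralHash), which also match resized or re-encoded copies of an image; the cryptographic hash below is a simplification for illustration only, and the signature list is invented.

```python
import hashlib

# Hypothetical list of signatures of confirmed material.
# (This example value is the SHA-256 of the bytes b"abc".)
KNOWN_SIGNATURES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def signature(image_bytes: bytes) -> str:
    """Derive a fixed-length signature from an image.

    The original image cannot be reconstructed from the signature,
    which is why the list itself is safe to distribute."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Compare an uploaded photo against the signature list."""
    return signature(image_bytes) in KNOWN_SIGNATURES
```

In a real deployment the perceptual hash additionally tolerates small modifications to the image, which a cryptographic hash like SHA-256 does not.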
Apple wants to do the same, but locally. Your iPhone will soon run a program that compares your photo library against the 'child pornography list'. If a certain number of photos or videos match, the flagged material is shared with Apple, which then passes it on to the authorities.
That only happens if you’ve enabled backing up your photos to iCloud; if not, your photos will not be checked.
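The two conditions in the text, a match threshold and the iCloud backup setting, can be sketched as a simple gate. The function name and threshold value are illustrative assumptions, not Apple's published implementation.

```python
# Hypothetical threshold: reporting only kicks in once enough
# photos match, never for a single incidental match.
MATCH_THRESHOLD = 30

def should_report(match_count: int, icloud_backup_enabled: bool) -> bool:
    """Decide whether flagged matches are reported, per the article:
    no iCloud photo backup means no checking and no reporting at all."""
    if not icloud_backup_enabled:
        return False
    return match_count >= MATCH_THRESHOLD
```

The iCloud condition comes first: with backup disabled, the match count is irrelevant because the photos are never checked.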
Principle step
Although the actual operation is not much different from other cloud services, it is an important step in principle, say critics. For the first time, your phone is running software that monitors what kind of photos and videos you store.
While Apple promises to use the system exclusively for child porn, there are risks. “It is a difficult discussion, because no one is against fighting child pornography,” says lawyer Terstegge. “But the danger is that this mechanism is also used for other purposes.”
For example, a government, such as that of China, could theoretically ask Apple to scan for 'unwelcome' photos and videos. While there is no reason to think Apple would want to cooperate, such scanning is technically impossible today; soon it will not be.
Balance
Arda Gerkens, director of the Dutch expertise bureau for online child abuse and vice-chair of the Senate, is positive about the system.
“It seems that Apple has been looking for a balance between privacy and protecting children,” Gerkens said. “As far as we can see, it worked.”
Checking photos and videos locally also has a privacy advantage: it happens entirely on the device, without Apple seeing or processing your photos. It is a principle Apple also applies to face and object recognition in photos: where Google does that on computers in its data centers, iPhones do it locally.
Who is actually in charge of your device?
However, it remains to be seen whether this approach is acceptable in the Netherlands at all, says lawyer Terstegge. "A device that you buy cannot suddenly do all kinds of things that you do not expect. Who is actually in charge of your device?"