Apple to unveil new on-device tools that identify child abuse images
Similar tech is already used by companies like Facebook and Google, although critics say it could lead to governmental misuse.
Tomorrow, Apple will reportedly announce new software, installed directly on consumers’ devices, that uses photo hashing to help identify potential child sexual abuse material (CSAM) stored on users’ iPhones, iPads, and MacBooks. As 9to5Mac explains, Apple products “would download a set of fingerprints representing illegal content and then check each photo in the user’s camera roll against that list. Presumably, any matches would then be reported for human review.”
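If 9to5Mac’s description is accurate, the core mechanic is simple set membership: compute a fingerprint for each photo, then check it against the downloaded list. The Swift sketch below is purely illustrative; it uses an ordinary cryptographic hash (SHA-256 via Apple’s CryptoKit) as a stand-in, whereas real scanners in this space use perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding. Every function name here is hypothetical.

```swift
import Foundation
import CryptoKit  // Apple-platform framework; SHA-256 is a stand-in only

// Compute a stand-in "fingerprint" for a photo's raw bytes.
// A cryptographic hash only matches exact bytes; real CSAM scanners
// use perceptual hashes that also match visually similar images.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Check every photo against the downloaded set of known fingerprints,
// collecting the matches that would presumably go to human review.
func flagMatches(in photos: [Data], against known: Set<String>) -> [Data] {
    photos.filter { known.contains(fingerprint(of: $0)) }
}
```

Note the design choice implied by the quote: the comparison happens entirely on the device, and only matches would be escalated for review.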
Apple already employs similar machine learning tech for files uploaded to iCloud, but this would be the first time the company has applied such methods on the customer side, on the phones and other devices themselves, before cloud storage ever comes into play.
Identifying and prosecuting CSAM is obviously vital within digital ecosystems, but many critics worry this particular technology is too error-prone, opens the door to potential governmental abuse, and is a starting point toward what they believe is Apple’s ultimate goal: hashing end-to-end encrypted (E2E) images and files.
“The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t ‘hurt’ anyone’s privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal,” cryptographer and security expert Matthew Green tweeted. Whatever the case, if the rumor proves true, we should know more about the potential security shakeup tomorrow.
Similar technology already used in other venues — As journalist Charles Arthur notes, this kind of photo identification of illegal material is nothing new: Facebook introduced similar programs in 2011, and Google has been scanning images since 2008. That precedent, however, does little to ease worries about what happens if companies begin applying this kind of matching to E2E messaging.
It’s easy to envision law enforcement subpoenaing a company like Apple for a customer’s data (which it can already legally obtain), then using the hashing program to identify any number of things, from illegal materials to incriminating objects and locations. Facebook is already attempting something similar with its WhatsApp messaging service’s encrypted data. All of this hints at serious security debates on the horizon among consumers, tech companies, and governmental agencies.