Apple recently announced new child safety features across all of its software platforms. The features will roll out with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey in the US later this year, and aim, among other things, to limit the spread of child sexual abuse material (CSAM).
One of the new features will scan iPhones and iPads for known CSAM and report it to the National Center for Missing and Exploited Children (NCMEC). Apple claims that its method of detecting known CSAM "is designed with user privacy in mind," but the company's statement has not reassured security experts.
According to a recent report in the Financial Times, security researchers have warned that Apple's new tool could be used for surveillance, putting the personal information of millions of people at risk.
Their concern is based on details Apple shared with some US academics earlier this week. Two anonymous security researchers briefed on Apple's plans revealed that the proposed system (called "neuralMatch") would alert a team of human reviewers if it detects CSAM on an iPhone or iPad. The reviewers would then contact law enforcement if they can verify the material.
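None of the reporting includes Apple's actual code, but the rough shape of such a system can be sketched. The Python below is a heavily simplified, hypothetical illustration of client-side hash matching, not Apple's neuralMatch: the hash function, database contents, and threshold are all assumptions, and Apple's published design is considerably more involved (perceptual hashing plus cryptographic safeguards rather than plain file digests).

```python
import hashlib
from pathlib import Path

# Hypothetical database of known hashes, provisioned by a third party
# and opaque to the device owner.
KNOWN_HASHES: set[str] = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

# Illustrative assumption: human review is triggered only past a match threshold.
MATCH_THRESHOLD = 30

def image_hash(path: Path) -> str:
    """Stand-in for a perceptual/neural hash; here simply SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(image_paths: list[Path]) -> bool:
    """Return True if enough images match the opaque database to flag for human review."""
    matches = sum(1 for p in image_paths if image_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```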
While many security researchers support Apple's efforts to curb the spread of CSAM, some worry that governments could misuse the tool to gain access to their citizens' data.
Ross Anderson, a professor of security engineering at Cambridge University, said:
"It's an absolutely horrific idea, because it is going to lead to distributed mass surveillance… of our phones and laptops."
Matthew Green, a professor of computer science at the Johns Hopkins Information Security Institute, also expressed concern on Twitter:
I've had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
- Matthew Green (@matthew_d_green) August 4, 2021
But even if you believe Apple will not allow these tools to be misused, there is still a lot to be concerned about. These systems rely on a database of problematic "media hashes" that you, as a consumer, cannot review. The hashes use a new and proprietary neural hashing algorithm that Apple has developed and gotten NCMEC to agree to use… We don't know much about this algorithm. What if someone finds a way to break it?
Although the algorithm is currently trained to detect CSAM, it could be adapted to scan for other targeted images or text, such as anti-government signs or slogans, making it an extremely useful tool for authoritarian governments.
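To make that retargeting concern concrete, here is a hypothetical continuation of the sketch above: the client-side membership test never learns what a hash represents, so the same code detects whatever content the supplied database happens to encode. The function names and parameters are illustrative assumptions, not anything from Apple's design.

```python
import hashlib
from pathlib import Path

def image_hash(path: Path) -> str:
    """Stand-in perceptual hash (SHA-256 of raw bytes, purely for illustration)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_with_database(image_paths: list[Path], hash_db: set[str], threshold: int) -> bool:
    """Flag for review if enough images match the supplied (opaque) hash set."""
    matches = sum(1 for p in image_paths if image_hash(p) in hash_db)
    return matches >= threshold

# Swapping the database silently changes what the same code searches for:
#   scan_with_database(photos, csam_hashes, threshold=30)         # the stated purpose
#   scan_with_database(photos, other_target_hashes, threshold=1)  # any other content
```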