Apple recently announced some new child safety features across its software platforms. The new features will ship with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey in the US later this year, and are aimed at curbing the spread of Child Sexual Abuse Material (CSAM), among other things.
One of the new features will essentially scan iPhones and iPads for CSAM and report matches to the National Center for Missing and Exploited Children (NCMEC). Apple claims that its method of detecting known CSAM "is designed with user privacy in mind." However, the company's statement has not reassured security experts.
According to a recent publication in the Financial Times, security researchers have warned that Apple's new tool could be used for surveillance, endangering the personal information of millions of people.
Their concern is based on data Apple shared with some US academics earlier this week. Two unnamed security researchers who reviewed Apple's briefing revealed that the proposed system (called "neuralMatch") will proactively alert a team of human reviewers if it detects CSAM on an iPhone or iPad. The reviewers will then contact law enforcement if they are able to verify the material.
While many security researchers support Apple's efforts to curb the spread of CSAM, some have expressed concern that the tool could be misused by governments seeking access to their citizens' data.
Ross Anderson, a professor of security engineering at Cambridge University, said:
"It's an absolutely horrific idea, because it is going to lead to distributed mass surveillance… of our phones and laptops."
Matthew Green, a professor of computer science at the Johns Hopkins Information Security Institute, also expressed concern on Twitter:
I've had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
- Matthew Green (@matthew_d_green) August 4
But even if you think Apple won't allow these tools to be abused… there's still a lot to worry about. These systems are based on a database of problematic “media hashes” that you, as a consumer, cannot control … Hashes use a new and proprietary neural hashing algorithm that Apple has developed and got NCMEC to agree to use … We don't know much about this algorithm. What if someone can decipher it?
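The hash-database matching that Green describes can be illustrated with a simplified sketch. This is purely illustrative: Apple's actual NeuralHash algorithm is proprietary and undisclosed, so a standard cryptographic hash stands in for it here, and the function names and data are hypothetical.

```python
import hashlib

# Hypothetical stand-in for Apple's proprietary NeuralHash.
# A real perceptual hash is designed to map visually similar images
# to the same digest; a cryptographic hash only matches exact bytes,
# but it is enough to show the matching step Green describes.
def media_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The database of known "media hashes" that, as Green notes, the
# consumer cannot inspect or control (illustrative entries only).
KNOWN_HASHES = {media_hash(b"flagged-example")}

def scan(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the database."""
    return media_hash(image_bytes) in KNOWN_HASHES

print(scan(b"flagged-example"))  # True: hash is in the database
print(scan(b"innocent-photo"))   # False: no match
```

The key point of Green's concern follows directly from this structure: whoever controls the contents of `KNOWN_HASHES` controls what the scanner flags, and the device owner has no way to audit it.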
Although the algorithm is currently trained to detect CSAM, it could be adapted to scan for other targeted images or text, such as anti-government signs, making it an extremely useful tool for authoritarian governments.