Apple will scan devices for child pornography

Apple recently announced new child safety features across its software platforms. The features will ship with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey in the US later this year, and are aimed at curbing the spread of Child Sexual Abuse Material (CSAM), among other things.


One of the new features will essentially scan iPhones and iPads for CSAM and report it to the National Center for Missing and Exploited Children (NCMEC). Apple claims that its method of detecting known CSAM "is designed with user privacy in mind". The company's statement, however, has not reassured security experts.

According to a recent publication in the Financial Times, security researchers have warned that Apple's new tool could be used for surveillance, endangering the personal information of millions of people.

Their concern is based on information Apple shared with some US academics earlier this week. Two anonymous security researchers briefed on Apple's plans revealed that the proposed system (called "neuralMatch") would alert a team of human reviewers if it detects CSAM on an iPhone or iPad. The reviewers would then contact law enforcement, provided they can verify the material.

While many security researchers support Apple's efforts to curb the spread of CSAM, some have expressed concern that governments could misuse the tool to gain access to their citizens' data.

Ross Anderson, professor of security engineering at Cambridge University, said:

"It's an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of ... our phones and laptops."

Matthew Green, a professor of computer science at the Johns Hopkins Institute for Information Security, also expressed concern on Twitter:

But even if you think Apple won't allow these tools to be misused… there's still a lot to worry about. These systems rely on a database of problematic "media hashes" that you, as a consumer, can't review. The hashes use a new, proprietary neural hashing algorithm that Apple has developed and gotten NCMEC to agree to use… We don't know much about this algorithm. What if someone can break it?

Although the algorithm is currently trained to detect CSAM, it could be adapted to scan for other targeted images or text, such as anti-government signs, making it an extremely useful tool for authoritarian governments.
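The matching flow the researchers describe can be sketched in a few lines. Note the heavy caveats: Apple's real system uses its proprietary perceptual "NeuralHash" algorithm and on-device cryptographic protocols, none of which are public; the sketch below substitutes an ordinary cryptographic hash and hypothetical names (`media_hash`, `scan_library`, `threshold`) purely to illustrate the idea of comparing content against a database of known hashes.

```python
import hashlib

def media_hash(data: bytes) -> str:
    """Stand-in fingerprint. The real system uses a perceptual neural
    hash (robust to resizing/recompression); SHA-256 is used here only
    to keep the illustration self-contained."""
    return hashlib.sha256(data).hexdigest()

def scan_library(images, known_hashes, threshold=1):
    """Compare each item's hash against the known-hash database and
    flag the account for human review once matches reach a threshold
    (reportedly, real reviewers only see anything past such a point)."""
    matches = [img for img in images if media_hash(img) in known_hashes]
    return len(matches) >= threshold, matches

# Made-up example data
known = {media_hash(b"known-bad-image")}
flagged, hits = scan_library([b"cat.jpg", b"known-bad-image"], known)
```

This also makes Green's point concrete: the consumer only ever sees opaque hashes in `known_hashes`, so there is no way to audit what the database actually targets.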

iGuRu.gr — The Best Technology Site in Greece


Written by giorgos

