Apple will scan devices for child pornography

Apple recently announced new child safety features across its software platforms. The features will ship with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey in the US later this year, and are aimed at curbing the spread of Child Sexual Abuse Material (CSAM), among other things.


One of the new features will essentially scan iPhones and iPads for CSAM and report matches to the National Center for Missing and Exploited Children (NCMEC). Apple claims that its method of detecting known CSAM "is designed with user privacy in mind". However, the company's statement has not reassured experts.

According to a recent report in the Financial Times, security researchers have warned that Apple's new tool could be used for surveillance, endangering the personal information of millions of people.

Their concern is based on information Apple shared with some US academics earlier this week. Two unnamed security researchers who were briefed on Apple's plans revealed that the proposed system (called "neuralMatch") will proactively alert a team of human reviewers if it detects CSAM on an iPhone or iPad. The reviewers will then contact law enforcement, provided they can verify the material.
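The article does not describe the matching mechanism in detail. As a rough, purely illustrative sketch (every name, value and function below is hypothetical; Apple's reported perceptual "NeuralHash" and its cryptographic matching techniques are not modelled), the general shape is: hash each image on the device, compare it against a database of hashes of known material, and only escalate to human reviewers once enough matches accumulate.

```python
# Purely illustrative sketch of hash-database matching with a review threshold.
# Not Apple's neuralMatch: the database, threshold and hashing function are
# hypothetical stand-ins (a real system would use a perceptual/neural hash
# rather than a cryptographic digest).
import hashlib

KNOWN_HASH_DB = {"d2a84f4b", "9b74c989"}   # hypothetical hashes of known images
REVIEW_THRESHOLD = 30                       # hypothetical match count before review


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash; a perceptual hash would tolerate resizing or re-encoding."""
    return hashlib.sha256(image_bytes).hexdigest()[:8]


def needs_human_review(photo_library: list[bytes]) -> bool:
    """Flag the account only once the number of database matches crosses the threshold."""
    matches = sum(1 for img in photo_library if image_hash(img) in KNOWN_HASH_DB)
    return matches >= REVIEW_THRESHOLD
```

Part of what worries the researchers quoted below is exactly this structure: the user can inspect neither the contents of the hash database nor how the review threshold is set.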

While many security researchers support Apple's efforts to curb the spread of CSAM, some have expressed concern that the tool could be misused by governments seeking access to their citizens' data.

Ross Anderson, professor of security engineering at the University of Cambridge, said:

"It's an absolutely horrific idea, because it is going to lead to distributed mass surveillance… of our phones and laptops."

Matthew Green, a professor of computer science at the Johns Hopkins Information Security Institute, also expressed concern on Twitter:

But even if you think Apple won't allow these tools to be abused… there's still a lot to worry about. These systems are based on a database of problematic "media" that you, as a consumer, cannot control… Hashes use a new and proprietary neural hashing algorithm that Apple has developed and got NCMEC to agree to use… We don't know much about this algorithm. What if someone can decipher it?

Although the algorithm is currently trained to detect CSAM, it could be adapted to scan for other targeted images or text, such as anti-government signs or slogans, making it an extremely useful tool for authoritarian governments.
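To make that concrete with a toy example (again hypothetical, not Apple's algorithm): a perceptual-hash matcher has no notion of what its database represents, so pointing the same code at a different set of target hashes retargets the entire system.

```python
# Illustrative only: a generic perceptual-hash matcher. Nothing here is
# Apple's NeuralHash; the hashes and threshold are made-up examples.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two equal-length bit-string hashes."""
    return bin(a ^ b).count("1")


def matches_database(image_hash: int, target_hashes: set[int], max_distance: int = 4) -> bool:
    """A hash 'matches' if it is within max_distance bits of any database entry."""
    return any(hamming_distance(image_hash, t) <= max_distance for t in target_hashes)


# The matcher flags whatever set of hashes it is given; swapping the database
# is all it takes to look for a different category of image.
csam_db = {0b1011_0110_0101_0001, 0b0110_1100_1111_0000}           # hypothetical
protest_poster_db = {0b1111_0000_1010_1010}                        # equally hypothetical

print(matches_database(0b1011_0110_0101_0011, csam_db))            # True (1 bit differs)
print(matches_database(0b1011_0110_0101_0011, protest_poster_db))  # False (9 bits differ)
```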
