When Apple “sacrifices” a little privacy to better fight child pornography

Apple unveiled new tools on Thursday to identify child pornography on its devices and services more effectively. But some computer security experts fear that the system opens a Pandora’s box of mass surveillance.

The famous Apple brand, which has built part of its reputation on the protection of personal data, unveiled a new system on Thursday, August 5, that raised the hackles of leading online privacy experts.

Called “NeuralMatch,” the tool pursues a goal that is hard to criticize: fighting child pornography. In an update to be rolled out first in the United States next month, Apple plans to use artificial intelligence and cryptography to scan photos stored locally on iPhones and iPads and identify any that constitute child pornography.

A privacy-respecting Big Brother?

The American company can already inspect all the images stored on iCloud, its remote backup service. Its main competitors, such as Google, Dropbox and Microsoft’s OneDrive, do the same.

But the iPhone maker goes much further: it is preparing to analyze photos that users have not yet chosen to upload to iCloud’s servers. “Apple’s initiative could be a game-changer,” John Clark, president of the National Center for Missing and Exploited Children (NCMEC), America’s leading organization against child exploitation, told the Wall Street Journal.

The tool lets Apple signal to users of its devices that they can have no secret garden, even in the darkest recesses of their smartphones. Yet this new Big Brother is meant to be as privacy-respecting as possible. The algorithm does not look at the images themselves: it reduces each file to a digital identifier (the equivalent of a fingerprint) and looks for a match in NCMEC’s database of more than 200,000 images of child sexual abuse.
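
In concept, the comparison works like the sketch below. To be clear about what is and is not from the source: Apple’s real system relies on a proprietary perceptual hash called NeuralHash, which tolerates resizing and re-encoding; the cryptographic hash, the `fingerprint` helper and the empty reference set here are illustrative stand-ins for that pipeline, not its actual code.

```python
import hashlib

# Stand-in for the reference list of known-image fingerprints (NCMEC-style).
# A real deployment would use a perceptual hash robust to re-encoding;
# SHA-256 is used here only to demonstrate fingerprint-based matching.
KNOWN_FINGERPRINTS: set[str] = set()  # would be loaded from the reference database

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image file to a fixed-size identifier, its 'fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Compare the fingerprint against the reference list; the image
    content itself is never examined."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

The design point the article is describing is that only identifiers are compared: a lookup can confirm that a photo is already in the database, but it reveals nothing about photos that are not.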

If the AI discovers several photos in an iPhone’s storage that also appear on NCMEC’s list, they are sent to an Apple employee, who can then view them in the clear to confirm that they are indeed child pornography. For Apple, these safeguards make NeuralMatch a minimally intrusive system. “If you store a collection of child abuse photos you will have problems, but for others it will not make any difference at all,” said Erik Neuenschwander, Apple’s head of privacy, interviewed by the New York Times.
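
That escalation step can be sketched the same way. The threshold value below is invented (Apple has not published the real number), and `matches_known_material` is the hypothetical helper from the previous sketch:

```python
REVIEW_THRESHOLD = 5  # hypothetical; the actual threshold is not public

def needs_human_review(library: list[bytes]) -> bool:
    """Surface a photo library for manual review only once several
    distinct images match the reference list, as described above."""
    matches = sum(1 for image in library if matches_known_material(image))
    return matches >= REVIEW_THRESHOLD
```

Requiring multiple matches before anything is shown to a human is what lets Apple argue that a stray false positive on a single photo never reaches an employee’s eyes.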

The new system “is clearly made to reassure the authorities and show goodwill to law enforcement,” said Riana Pfefferkorn, a specialist in cybersurveillance issues at the Stanford Internet Observatory, during a roundtable of cybersecurity experts devoted to Apple’s announcement.

A 2019 New York Times investigation had, in fact, shown that the fight against child pornography was a weak point for all the major Internet platforms. And Apple has, so far, lagged far behind other services in detecting illegal content. “In 2019, Facebook reported 15.8 million pieces of child pornography content, compared with 205 for Apple. There is clearly a problem of underreporting of the number of illegal images circulating through Apple’s services,” noted Casey Newton, one of the best-known American technology journalists, on Twitter.

The start of trouble?

Apple is also reportedly preparing to strengthen data protection on iCloud in the coming months by introducing end-to-end encryption, meaning all files saved in the cloud would become unreadable even to Apple. “This is a long-awaited decision, and the announcement that has just been made may be a way for Apple to reassure the authorities by making it clear that it will still be possible to spot illegal content despite the encryption,” explained Riana Pfefferkorn.

But in multiplying these pledges to the law enforcement agencies that have so often criticized it, Apple may have gone too far, according to several cybersurveillance specialists. “It’s a terrible system that will inevitably turn our computers [MacBooks, editor’s note], iPhones and iPads into a massive surveillance tool,” said Ross Anderson, a computer security specialist at Cambridge University, interviewed by the Financial Times.

“This is how electronic surveillance takes hold: first for the noblest of motives, then to hunt down gun footage, then to find photos of protests, then screenshots of chats about demonstrations, and so on,” warned John Hamasaki, an American criminal defense lawyer, on Twitter.

Apple has, in effect, opened a Pandora’s box by developing an AI that scans photos directly on smartphones and tablets. “All it takes now is changing a few parameters in the algorithm to allow it to find other types of photos,” argued the Electronic Frontier Foundation, one of the main US NGOs fighting for the protection of online privacy, in a statement released Thursday.

“Governments, including that of the United States, will inevitably ask Apple to adapt this system to the fight against terrorism. And we know how that notion can mean different things from one country to another,” warned David Thiel, technical director of the Stanford Internet Observatory, during a roundtable devoted to Apple’s initiative. The fight against terrorism has, for example, been the pretext China used for its policy of persecuting the Uighur Muslim minority.

And after terrorism? “I can very well imagine the music and film lobbies demanding that Apple adapt this technology to find files that violate copyright,” said Riana Pfefferkorn.

“You had better trust Apple’s management never to give in to all these demands. From now on, trust is all we have left to prevent this technology from going astray,” lamented Matthew Green, a cryptography researcher at Johns Hopkins University in Baltimore.

While waiting to see whether the predictions of these technological Cassandras come true, the system continues to fuel the broad debate over where to draw the line on individual freedoms in the name of fighting for even the noblest of causes.
