Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).
News of the CSAM initiative leaked before Apple detailed its plans, and security researchers have already begun expressing concerns about how Apple's new image scanning protocol could be used in the future, as noted by the Financial Times.
Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's
iPhone before they're uploaded to
iCloud. If there is a match, that photograph is uploaded with a cryptographic safety voucher, and at a certain threshold, a review is triggered to check if the person has CSAM on their devices.
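As a rough sketch of the flow described above (not Apple's actual implementation: NeuralHash is a perceptual, neural-network-based hash, its output format is not public, and the review threshold is undisclosed), on-device matching against a database of known hashes with a threshold gate might look something like this, with SHA-256 standing in for the real hash:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: Apple's NeuralHash is a perceptual hash and
// its review threshold is not public. SHA-256 is a stand-in so the
// example stays self-contained.
struct HashMatcher {
    let knownHashes: Set<String>   // hashes of known CSAM images (assumed hex strings)
    let reviewThreshold: Int       // matches required before human review (assumed value)

    // Hex-encoded SHA-256 of the image bytes (stand-in for NeuralHash).
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // A match produces a "safety voucher"; a review is only triggered
    // once the number of vouchers crosses the threshold.
    func shouldTriggerReview(for uploads: [Data]) -> Bool {
        let voucherCount = uploads.filter { knownHashes.contains(hash(of: $0)) }.count
        return voucherCount >= reviewThreshold
    }
}
```

The point of the threshold, combined with the manual review Apple describes, is that a single stray match is not enough to flag an account.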
At the current time, Apple is using its image scanning and matching technology only to look for child abuse material, but researchers worry that in the future it could be adapted to scan for other kinds of imagery, such as anti-government signs at protests.
In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to iCloud. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.
Green also raised concerns over the hashes that Apple plans to use, because there could be "collisions," where a harmless file shares a hash with known CSAM and results in a false flag.
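To illustrate why collisions worry researchers, here is a deliberately weak toy hash (a wrapping sum of bytes, nothing like a real perceptual hash): with only 256 possible outputs, unrelated files inevitably map to the same value, which is the false-flag scenario Green describes, albeit at a vastly exaggerated rate:

```swift
// Toy 8-bit "hash" (wrapping sum of bytes), purely to illustrate collisions.
func toyHash(_ bytes: [UInt8]) -> UInt8 {
    bytes.reduce(0) { $0 &+ $1 }
}

let harmlessFile: [UInt8] = [10, 20, 30]   // hypothetical harmless content
let unrelatedFile: [UInt8] = [15, 25, 20]  // different content, same toy hash value
print(toyHash(harmlessFile) == toyHash(unrelatedFile))  // true: a match on hash alone, i.e. a false flag
```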
Apple, for its part, says that its scanning technology has an "extremely high level of accuracy" to make sure accounts are not incorrectly flagged, and reports are manually reviewed before a person's iCloud account is disabled and a report is sent to NCMEC.
Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."
<div class="center-wrap"><blockquote class="twitter-tweet" data-conversation="none"><p lang="en" dir="ltr">These are bad things. I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends.</p>— Matthew Green (@matthew_d_green)
August 5, 2021 <script async src="
https://platform.twitter.com/widgets.js" charset="utf-8"></script></div>
Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.
Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.
As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.
<div class="center-wrap"><blockquote class="twitter-tweet"><p lang="en" dir="ltr">And if you’re wondering whether Google scans images for child abuse imagery, I answered that in the story I wrote eight years ago: it’s been doing that **SINCE 2008**. Maybe all sit down and put your hats back on.
pic.twitter.com/ruJ4Z8SceY</p>— Charles Arthur (@charlesarthur)
August 5, 2021 <script async src="
https://platform.twitter.com/widgets.js" charset="utf-8"></script></div>
It's also worth noting that Apple was already scanning some content for child abuse images prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple uses screening technology to look for illegal images and disables accounts if evidence of CSAM is detected.
Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.
This article, "
Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread" first appeared on
MacRumors.comDiscuss this article in our forums
<div class="feedflare">
<img src="[url]http://feeds.feedburner.com/~ff/MacRumors-Front?d=yIl2AUoC8zA" border="0"></img>[/url]
<img src="[url]http://feeds.feedburner.com/~ff/MacRumors-Front?d=6W8y8wAjSf4" border="0"></img>[/url]
<img src="[url]http://feeds.feedburner.com/~ff/MacRumors-Front?d=qj6IDK7rITs" border="0"></img>[/url]
</div><img src="
http://feeds.feedburner.com/~r/MacRumors-Front/~4/gGvmQHH9k2c" height="1" width="1" alt=""/>
Source:
Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread