Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated]

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.
Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.
The majority of criticism was leveled at Apple's planned on-device CSAM detection, which researchers lambasted for relying on dangerous technology that bordered on surveillance, and derided as ineffective at identifying images of child sexual abuse.
Apple initially attempted to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and various new documents, conducting interviews with company executives, and more.
However, despite Apple's efforts, the controversy didn't go away. Apple eventually went ahead with the rollout of the Communication Safety features for Messages, which went live earlier this week with the release of iOS 15.2, but it decided to delay the rollout of CSAM detection following the torrent of criticism it clearly hadn't anticipated.
Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
The above statement was added to Apple's Child Safety page, but it has now been removed, along with all mentions of CSAM, raising the possibility that Apple has kicked the plan into the long grass or abandoned it altogether. We've reached out to Apple for comment and will update this article if we hear back.
Update: Apple spokesperson Shane Bauer told The Verge that although the CSAM detection feature is no longer mentioned on Apple's website, the company's plans for CSAM detection have not changed since September, meaning the feature is still coming in the future.
"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in September.<div class="linkback">Tag:
This article, "
Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated]" first appeared on
MacRumors.comDiscuss this article in our forums
<div class="feedflare">
<img src="http://feeds.feedburner.com/~ff/MacRumors-Front?d=yIl2AUoC8zA" border="0"></img> <img src="http://feeds.feedburner.com/~ff/MacRumors-Front?d=6W8y8wAjSf4" border="0"></img> <img src="http://feeds.feedburner.com/~ff/MacRumors-Front?d=qj6IDK7rITs" border="0"></img></div>