Title: University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology Post by: HCK on August 23, 2021, 04:05:09 pm
Respected university researchers are sounding the alarm over the technology behind Apple's plans to scan iPhone (https://www.macrumors.com/guide/iphone/) users' photo libraries for CSAM, or child sexual abuse material, calling the technology "dangerous."

Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at the Princeton University Center for Information Technology Policy, penned an op-ed (https://www.washingtonpost.com/opinions/2021/08/19/apple-csam-abuse-encryption-security-privacy-dangerous/) for The Washington Post outlining their experiences building image detection technology. The researchers started a project two years ago to identify CSAM in end-to-end encrypted online services. They note that, given their field, they "know the value of end-to-end encryption, which protects data from third-party access," and that this is precisely why they are horrified by CSAM "proliferating on encrypted platforms."

Mayer and Kulshrestha wanted to find a middle ground: a system that online platforms could use to find CSAM while still protecting end-to-end encryption. Experts in the field doubted that such a system was feasible, but the researchers did manage to build it, and in the process uncovered a significant problem:

"We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection."

Since Apple's announcement of the feature, the company has been bombarded with concerns (https://www.macrumors.com/2021/08/05/security-researchers-alarmed-apple-csam-plans/) that the system behind detecting CSAM could be used to detect other kinds of photos at the request of oppressive governments. Apple has strongly pushed back against that possibility, saying it would refuse any such request. Nonetheless, concerns about the future implications of the technology remain widespread. Mayer and Kulshrestha said they were "disturbed" by how governments could use the system to detect content other than CSAM:

"A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials."

Apple has continued to address user concerns over its plans, publishing additional documents (https://www.macrumors.com/2021/08/13/apple-child-safety-features-new-details/) and an FAQ page (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/).
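The matching concept the researchers describe above can be sketched, in heavily simplified form, as a lookup against a database of hashes of known content: a match alerts the service, while innocent content reveals nothing. This toy Python example uses a plain SHA-256 set lookup with made-up data; real systems, including Apple's proposed one and the researchers' prototype, instead use perceptual image hashes and cryptographic protocols designed to keep both the database and the match results hidden from users.

```python
import hashlib

# Hypothetical database of hashes of known harmful content.
# In deployed systems this would hold perceptual hashes supplied by
# a clearinghouse, not SHA-256 digests of example strings.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-example").hexdigest(),
}

def check_shared_content(data: bytes) -> bool:
    """Alert (return True) only when the content's hash matches the
    database; any non-matching content tells the service nothing."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# Matching content triggers an alert; innocent content does not.
assert check_shared_content(b"known-bad-example") is True
assert check_shared_content(b"holiday photo") is False
```

Note that an exact-hash lookup like this only matches byte-identical files; the perceptual hashing used in practice also matches resized or re-encoded copies, which is part of what makes the technique powerful and, per the researchers, repurposable.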
Apple continues to believe that its CSAM detection system, which will occur on a user's device, aligns with its long-standing privacy values.<div class="linkback">Tags: Apple privacy (https://www.macrumors.com/guide/apple-privacy/), WashingtonPost.com (https://www.macrumors.com/guide/washingtonpost-com/), Apple child safety features (https://www.macrumors.com/guide/apple-child-safety-features/)</div> This article, "University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology (https://www.macrumors.com/2021/08/20/university-researchers-csam-dangerous/)" first appeared on MacRumors.com (https://www.macrumors.com) Discuss this article (https://forums.macrumors.com/threads/university-researchers-who-built-a-csam-scanning-system-urge-apple-to-not-use-the-dangerous-technology.2308132/) in our forums