Title: CSAM scanning would be abused, says Apple – using argument it originally rejected
Post by: HCK on February 28, 2024, 04:05:07 pm
<div class="feat-image">(https://9to5mac.com/wp-content/uploads/sites/6/2024/02/CSAM-scanning-would-be-abused.jpg?quality=82&strip=all&w=1600)</div><p>When <a href="https://9to5mac.com/2021/08/05/apple-announces-new-protections-for-child-safety-imessage-safety-icloud-photo-scanning-more/" target="_blank" rel="noreferrer noopener">Apple announced its own approach[/url] to <a href="https://9to5mac.com/guides/csam/" target="_blank" rel="noreferrer noopener">CSAM[/url] scanning, many of us warned that the process used to check for child sexual abuse materials <a href="https://9to5mac.com/2021/08/05/scanning-for-child-abuse-images/" target="_blank" rel="noreferrer noopener">would ultimately be abused by repressive governments[/url] to scan for things like political protest plans.</p> <p>The Cupertino company rejected that reasoning at the time, but in an ironic twist is now using precisely this argument in response to the Australian government … </p> <a href="https://9to5mac.com/2024/02/22/csam-scanning-apple-australia/#more-935076" data-post-id="935076" data-layer-pagetype="post" data-layer-postcategory="aapl,csam,privacy" data-layer-viewtype="unknown" class="more-link">more…[/url] Source: CSAM scanning would be abused, says Apple – using argument it originally rejected (https://9to5mac.com/2024/02/22/csam-scanning-apple-australia/) |