Apple finally admits the CSAM scanning flaw we all pointed out at the time
<p>Almost nine months after Apple confirmed that <a href="https://9to5mac.com/2022/12/07/apple-confirms-that-it-has-stopped-plans-to-roll-out-csam-detection-system/" target="_blank" rel="noreferrer noopener">it had abandoned plans</a> to carry out <a href="https://9to5mac.com/guides/csam/" target="_blank" rel="noreferrer noopener">CSAM scanning</a>, the company has finally admitted the flaw which <a href="https://9to5mac.com/2021/08/05/scanning-for-child-abuse-images/" target="_blank" rel="noreferrer noopener">so many of us pointed out at the time</a>.</p>
<p>The company explained the reason it decided against scanning devices for child sexual abuse materials (CSAM) in a statement to Wired … </p>