Apple postpones deployment of CSAM photo scanning system due to privacy concerns


Last month, as part of an effort to improve child safety, Apple announced plans to scan iCloud Photos for potential child sexual abuse material (CSAM). Following backlash from security experts and digital rights groups such as the Electronic Frontier Foundation, Apple has now postponed the rollout of CSAM detection.

Apple delays deployment of CSAM detection

Apple initially planned to roll out CSAM detection later this year for accounts set up as families in iCloud on iOS 15, iPadOS 15, and macOS Monterey. The Cupertino giant has not yet revealed a new date for the feature's deployment. Apple also did not elaborate on which aspects of CSAM detection it plans to improve, or how it intends to approach the feature to strike a healthy balance between privacy and security.

"Previously, we announced plans for features aimed at protecting children from predators who use communication tools to recruit and exploit them, and at limiting the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in an official statement.

As a reminder, Apple's CSAM detection works on your device and does not scan images in the cloud. It attempts to match images against known CSAM image hashes provided by NCMEC and other child safety organizations, and this on-device matching takes place just before an image is uploaded to iCloud Photos. However, researchers later discovered a hash collision that could cause an image to be flagged as a false positive. It was also revealed that Apple has been scanning iCloud Mail for child abuse material since 2019.
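To make the idea of on-device hash matching easier to picture, here is a minimal conceptual sketch in Swift. It is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic protocols such as private set intersection, whereas this sketch uses a plain SHA-256 digest and an in-memory set, and every name in it is invented for illustration.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only. Apple's real system relies on a perceptual hash
// (NeuralHash) and private set intersection rather than an exact SHA-256
// match; all functions and names below are hypothetical.

/// Placeholder for the hash database that, in the real system, would be
/// derived from material provided by NCMEC and other child safety organizations.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

/// Computes a hex-encoded digest for the image bytes about to be uploaded.
func imageHash(for imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// The on-device check that would run just before an image is sent to iCloud Photos.
func matchesKnownHash(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageHash(for: imageData))
}

let knownHashes = loadKnownHashDatabase()
let photo = Data("example image bytes".utf8)
print(matchesKnownHash(photo, against: knownHashes)) // false with an empty database
```

The reason Apple uses a perceptual hash instead of an exact digest like the one above is that it lets visually similar images (resized, recompressed, lightly edited) produce matching hashes; that same fuzziness is what made the hash collisions reported by researchers, and the resulting false-positive concerns, possible.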