Apple scans users’ iPhones and iCloud for images of child abuse


Apple has long positioned itself as a champion of user privacy across its products and services. Now, to protect minors from "predators who use communication tools to recruit and exploit them," the Cupertino giant has announced that it will scan photos stored on iPhones and in iCloud for images of child abuse.

According to a Financial Times report (paywall), the system is called neuralMatch. If it finds images or content related to Child Sexual Abuse Material (CSAM), it is designed to alert a team of human reviewers, who can then contact law enforcement agencies. The system was reportedly trained on 200,000 images from the National Center for Missing & Exploited Children. In practice, it hashes Apple users' photos and compares them against a database of known child sexual abuse imagery.
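As a rough illustration of the database-lookup side of this approach, the Swift sketch below hashes a photo and checks it against a set of known-image hashes. Apple's reported system uses an unpublished perceptual hash ("NeuralHash"), not SHA-256, and the hash values here are placeholders, so this only shows the general shape of the matching step.

```swift
import Foundation
import CryptoKit

// Placeholder "known image" hash database (hex strings). In the reported
// system these hashes would come from NCMEC's database, not be hard-coded.
let knownImageHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

/// Returns true if the photo's hash appears in the known-image database.
/// Real matching would rely on a perceptual hash so that re-encoded or
/// resized copies of an image still match; SHA-256 here only illustrates
/// the lookup structure.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}
```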

"All photos uploaded to iCloud in the United States will be given a 'safety voucher' indicating whether or not they are suspect, according to people briefed on the plan. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if they are clearly illegal, passed on to the authorities," the Financial Times report said.
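The "safety voucher" mechanism described in the report can be thought of as a per-photo flag plus a threshold check before anything is escalated for human review. The sketch below is only an assumption of how such a counter might look; the type names and the threshold value are illustrative, not Apple's.

```swift
// Sketch of the reported threshold logic; names and the threshold value
// are assumptions for illustration only.
struct SafetyVoucher {
    let photoID: String
    let matchedKnownDatabase: Bool
}

// Illustrative threshold; the report only says "a certain number".
let reviewThreshold = 30

/// Escalate to human review only once enough vouchers are flagged.
func shouldEscalateForHumanReview(_ vouchers: [SafetyVoucher]) -> Bool {
    let flaggedCount = vouchers.filter { $0.matchedKnownDatabase }.count
    return flaggedCount >= reviewThreshold
}
```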

Following the report, Apple published an official post in its newsroom to further explain how the new tools work. They were developed in collaboration with child safety experts and use on-device machine learning to warn children and their parents about sensitive and sexually explicit content in iMessage.
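To make the on-device iMessage safeguard more concrete, here is a hedged sketch of what such a flow might look like: an on-device model scores an incoming photo, and if it appears sexually explicit the photo is blurred and a warning is shown before viewing. The classifier type, score threshold, and function names are all assumptions; Apple has not published this API.

```swift
import Foundation

// Hypothetical sketch of the on-device iMessage safeguard. The classifier,
// threshold, and function names are assumptions, not Apple's API.
struct ExplicitContentClassifier {
    // Stand-in for an on-device ML model; returns a score in 0...1.
    func explicitnessScore(for imageData: Data) -> Double {
        return 0.0 // placeholder
    }
}

func handleIncomingPhoto(_ imageData: Data,
                         using classifier: ExplicitContentClassifier) {
    let score = classifier.explicitnessScore(for: imageData)
    if score > 0.8 { // assumed threshold
        // Blur the photo and warn the child before viewing;
        // the analysis happens entirely on the device.
        print("Photo blurred; warning shown before viewing.")
    } else {
        print("Photo displayed normally.")
    }
}
```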


In addition, the Cupertino giant said it will integrate "new technology" into iOS 15 and iPadOS 15 to detect CSAM images stored in iCloud Photos. If the system detects CSAM-related images or content, Apple will disable the user's account and file a report with the National Center for Missing & Exploited Children (NCMEC). Users who believe they have been flagged in error can appeal to have their accounts reinstated.

Apple is also expanding guidance in Siri and Search to help parents and children stay safe online and find relevant resources in unsafe situations. The voice assistant will likewise be updated to intervene when users perform searches related to CSAM.


As for availability, Apple says the new tools and systems will roll out first in the United States with the upcoming iOS 15, iPadOS 15, watchOS 8, and macOS Monterey updates. There is no word yet on whether the company will expand them to other regions in the future.