Apple’s CSAM Detection: A Backdoor to Privacy?


Apple has touted itself as the tech company best placed to compete with Facebook, Google, and others on protecting user privacy. Lately, however, the Cupertino giant has come under fire from privacy advocates over its CSAM prevention features.

Earlier, Apple found itself caught up in the NSO Pegasus scandal, when research suggested it was easier to plant spyware on an iPhone than on an Android device.

Now, Apple’s new system to curb the distribution of child sexual abuse material (CSAM) has put the company at the center of controversy. Critics accuse Apple of building a mechanism that could compromise the privacy of millions of people.

Efforts to limit CSAM are commendable, but Apple’s implementation has drawn sharp criticism.

What are Apple’s new child safety features?

On Monday, Apple stepped up its efforts to prevent the distribution of CSAM on its platforms, introducing three new features coming in iOS 15.

  • First, Apple will add new resources to Siri and Spotlight search to help users find answers to CSAM-related queries. This feature is harmless and changes nothing in terms of privacy.
  • Next, Apple will update the Messages app with a parental-control option that scans photos sent and received by users under the age of 18 for sexually explicit content. If such content is found, the photo in question is blurred and a warning message is displayed alongside it.
(Image: Apple’s child safety features. Source: Apple)
  • The last, and most controversial, feature scans your iCloud Photos library for known CSAM.

How does the CSAM feature work?

Parents will be notified when users under the age of 12 attempt to view flagged content. Notably, Apple scans the photos on the device itself, which keeps Apple unaware of their content.

Also, the notification a parent receives when a child accesses sensitive content is optional.

In addition, Apple will scan the iCloud Photos library locally for sexually explicit content and match it against a database of known CSAM hashes provided by the National Center for Missing & Exploited Children (NCMEC). If a match is found, a “safety voucher” is generated and uploaded to iCloud along with the photo.

When the number of these safety vouchers for an account crosses a preset threshold, Apple’s moderators can decrypt the matched photos and review them for CSAM. If CSAM is confirmed, Apple can report the account to law enforcement agencies.
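The threshold logic can be sketched as a toy simulation. This is an illustration only: in the real system, vouchers cannot be decrypted before the threshold is crossed (this is enforced with threshold secret sharing, not a plain counter), and the threshold value, hash set, and all names below are hypothetical.

```python
# Toy sketch of the safety-voucher threshold. All values are hypothetical;
# Apple's actual design enforces the threshold cryptographically.
THRESHOLD = 30  # assumed value; Apple did not publish the threshold at launch

KNOWN_CSAM_HASHES = {"a1b2c3", "d4e5f6"}  # stand-in for the NCMEC hash database

def vouchers_for_upload(photo_hashes):
    """Count how many photos in an upload batch match the known-hash set."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

class Account:
    """Tracks accumulated safety vouchers for one iCloud account."""

    def __init__(self):
        self.vouchers = 0

    def upload(self, photo_hashes):
        """Record any matches; return True once human review becomes possible."""
        self.vouchers += vouchers_for_upload(photo_hashes)
        return self.vouchers >= THRESHOLD
```

Below the threshold, nothing is reviewable; the point of the design is that a single accidental match reveals nothing about an account.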

What does Apple say?

The whole “scan the iCloud Photo Library” plan sounds like a bold and invasive move, but Apple insists the feature does not compromise user privacy. Specifically, it scans photos being uploaded to iCloud using NeuralHash.

NeuralHash is a tool that assigns an identifier to each photo, allowing matching while keeping the photo’s content hidden from Apple. The company also notes that users can disable iCloud Photos syncing to stop the scanning entirely.
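The key property of a perceptual hash like NeuralHash is that, unlike a cryptographic hash, visually similar images produce identical or near-identical hashes. A minimal sketch of the matching idea, using toy integer hashes and Hamming distance (all values are hypothetical, and Apple’s actual system compares NeuralHash outputs via private set intersection rather than in the clear):

```python
# Hedged illustration of perceptual-hash matching. NeuralHash is a neural
# network; this toy version only demonstrates the nearest-match idea.

def hamming_distance(h1: int, h2: int) -> int:
    """Count the differing bits between two fixed-width hashes."""
    return bin(h1 ^ h2).count("1")

def matches(photo_hash: int, database: set, max_distance: int = 0) -> bool:
    """A photo 'matches' if its hash is within max_distance bits of a known
    hash. Apple's system reportedly requires exact matches; max_distance > 0
    is shown only to illustrate perceptual-hash tolerance to small edits."""
    return any(hamming_distance(photo_hash, known) <= max_distance
               for known in database)
```

Because a slightly cropped or recompressed image yields a nearby hash, the database of known material can still be matched, which is exactly what makes the approach both effective and, to critics, worrying.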

In addition, Apple’s official blog says:

Messages uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.

So Why the Hullabaloo? The Critics’ Side of the Story

Apple has repeatedly emphasized that these features were built with privacy in mind to protect children from sexual predators, but critics disagree. Even though the Messages feature scans on the device, they argue, Apple is essentially building a surveillance system that could have catastrophic consequences.

Kendra Albert of Harvard’s Cyberlaw Clinic argues:

These “child protection” features are going to be costly for queer kids with unaccepting parents, who may be beaten or kicked out of their homes. A queer kid sending transition photos to a friend, for example, could be flagged by the new feature.

Imagine the potential impact of such a system in a country where homosexuality is still illegal. An authoritarian government could demand that Apple add LGBTQ+ content to the database of known hashes. Governments arm-twisting tech companies is nothing we haven’t seen before.

To be fair, Apple has a fairly good record with such requests. In 2016, for example, it firmly refused the FBI’s demand to decrypt the San Bernardino shooter’s iPhone.

However, in countries like China, where Apple stores iCloud data locally, the government could force Apple to comply with its demands, putting user privacy entirely at risk.

The Electronic Frontier Foundation described Apple’s system as “a fully built system just waiting for external pressure to make the slightest change.”

The last feature is even more concerning from a privacy standpoint. Scanning iCloud photos plainly intrudes on your privacy: it adds the ability to scan photos on your iPhone and match them against a database of known illegal, sexually explicit content.

I just want to upload my photos to cloud storage.

Apple’s Past and Present: Growing Concerns

When Apple refused the FBI’s demand to unlock a mass shooter’s iPhone, the company famously said: “Your device is yours. It’s not ours.”

After this move, however, the device seems to belong to Apple after all, since users have no control over who views the content on it.

Users can reasonably claim that the photos they take are theirs, and that Apple has no right to scan them. In my opinion, many would find that indisputable.

MSNBC raises the specter of a surveillance system built in the name of the “greater good,” comparing Apple’s iCloud photo scanning with NSO’s Pegasus spyware. According to the report:

Consider the spyware capabilities that the Israeli firm NSO Group sold to governments to track terrorists and criminals, and that some countries used to monitor activists and journalists. Now imagine that same capability hard-coded into every iPhone and Mac.

It’s not too hard to imagine the negative effects of Apple’s implementation of this feature.

Apple wants you to trust it!

Since the company announced these features, privacy advocates have fiercely opposed them. In response, Apple released a PDF containing an FAQ about its CSAM prevention initiatives.

In the document, Apple states that it will reject any request from government agencies to add non-CSAM images to the hash list. The PDF reads:

We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.

In an interview with TechCrunch, Apple’s privacy chief, Erik Neuenschwander, sought to address concerns about the features.

He states: “The device is still encrypted, we still don’t hold the key, and the system is designed to function on on-device data. What we designed has a device-side component, and it has that device-side component, by the way, to improve privacy. The alternative of just processing and evaluating users’ data on a server is actually more amenable to changes [without user knowledge] and less protective of user privacy.”

Clearly, Apple wants you to trust it with the private content of your phone. But no one knows when the company might walk back these promises.

The very existence of a system that inspects content you own before it is encrypted opens a can of worms. Once consent is granted and the system is in place, it will be difficult to limit its scope.

Will the benefits outweigh the risks of Apple’s new child safety features?

Apple’s efforts to fight child sexual abuse are commendable, but they are no excuse for prying into your data. The very existence of a surveillance system creates the potential for a security backdoor.

I believe transparency is key if Apple is to remain neutral. Some form of legal oversight could also help the company build trust in its plan to curb CSAM on the internet.

Nevertheless, introducing this system defeats the whole purpose of encryption, and the company may follow it with a series of experiments that could prove fatal to privacy.

More troubling still, it starts with Apple, the last company anyone expected this from. As the saying goes, you either die a hero or live long enough to see yourself become the villain.
