Apple responds to iCloud Photos scanning concerns in internal memo

On Thursday, Apple announced a trio of efforts aimed at adding new child protection features to the iPhone, iPad, and Mac in the United States. Now, the company has addressed concerns in an internal memo.

Apple says it will continue to explain and detail the features


Distributed to the teams that worked on the initiative and obtained by 9to5Mac, the internal memo sees Apple acknowledge recent “misunderstandings” and worries about the implications of the tech.

Nevertheless, the company says that it will “continue to explain and detail features so people understand” what has been built. In the meantime, Apple says “a lot of hard work lays ahead to deliver the features in the next few months.”

Criticism has so far been focused primarily on Apple’s plans to scan iCloud Photos for Child Sexual Abuse Material (CSAM), with many arguing that the feature could open doors to other, more worrying surveillance uses.

One of the most high-profile critics so far is Edward Snowden, who said the following: “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

The full internal memo sent to Apple employees and written by Software Vice President Sebastien Marineau-Mes is below:



What do Apple’s child protection features consist of?

iCloud Photos


To combat the spread of Child Sexual Abuse Material, content that depicts sexually explicit activities involving a child, Apple is adding a CSAM detection feature to its iCloud Photos service.

Before photos are uploaded to iCloud Photos, Apple devices will now analyze the images to find any matches against a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children.

Apple claims to have transformed the database into an “unreadable set of hashes that is securely stored on users’ devices.” The matching process is powered by cryptographic safety vouchers that encode the match results.
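To make the on-device step more concrete, here is a minimal Swift sketch of matching a photo's hash against a set of known hashes before upload. The perceptualHash and loadKnownHashDatabase functions are hypothetical stand-ins; Apple's actual NeuralHash algorithm and blinded-hash matching protocol are not reproduced here.

```swift
import Foundation

// Hypothetical stand-in for Apple's perceptual hash (NeuralHash). A real
// perceptual hash is robust to resizing and re-encoding; this placeholder
// only illustrates the shape of the pipeline.
func perceptualHash(of imageData: Data) -> String {
    return String(imageData.base64EncodedString().prefix(16))
}

// In the real design, the NCMEC-derived database ships to the device as an
// unreadable set of hashes; a plain Set is used here purely for illustration.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashDatabase()

// Before a photo is uploaded to iCloud Photos, the device computes its hash
// and records whether it matches the known-CSAM database. That match result
// is what gets encoded into the photo's safety voucher.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    return knownHashes.contains(perceptualHash(of: imageData))
}
```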

The contents of the safety vouchers can’t be interpreted by Apple unless an account crosses a threshold of known CSAM matches. If that threshold is reached, Apple will manually review the report to confirm the matches and disable the user’s account.

Additionally, the company says it will send a report to the National Center for Missing and Exploited Children.
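The threshold behavior can be modeled roughly as the Swift sketch below. It uses a plain counter and a visible match flag for readability; in Apple's described design, the per-voucher match result is hidden cryptographically until the threshold is crossed, and the actual threshold value was not published in the announcement.

```swift
import Foundation

// Simplified model of the server-side threshold gate. The Boolean flag and
// counter stand in for threshold cryptography: below the threshold the real
// vouchers are unreadable, not merely unread.
struct SafetyVoucher {
    let matchedKnownCSAM: Bool
    let encryptedPayload: Data   // opaque until the threshold is crossed
}

struct AccountReviewGate {
    let threshold: Int           // placeholder; Apple did not publish the value
    var matchCount = 0

    // Returns true once enough matching vouchers have accumulated that the
    // account would be escalated to manual review.
    mutating func receive(_ voucher: SafetyVoucher) -> Bool {
        if voucher.matchedKnownCSAM {
            matchCount += 1
        }
        return matchCount >= threshold
    }
}
```

Only once that gate is crossed would the vouchers be interpreted, a human review confirm the matches, the account be disabled, and a report be sent to NCMEC.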

Importantly, this feature is currently limited to the United States, though Apple hopes to expand elsewhere in the future. Additionally, photos stored only on-device are not included in the scanning process.

Messages


Alongside the controversial iCloud Photos plans, Apple unveiled a new opt-in communication safety feature in the Messages app for children who are part of an iCloud Family.

The feature uses on-device machine learning to analyze the contents of photos. If a child receives a sexually explicit image, the child will see a warning and the image will be blurred inside the Messages app.

If the child chooses to tap “View photo,” a brief pop-up message will appear explaining why the image is considered sensitive. If the child chooses to proceed, their iCloud Family parent will be notified.

That, too, will be explained in the pop-up message, along with links to additional help. Crucially, parental notifications are only available for children under the age of 13.
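Put together, the Messages flow reads like a small decision function. The Swift sketch below assumes a hypothetical on-device classifier result (isSexuallyExplicit) and simply follows the rules described above; the actual Messages implementation is not public.

```swift
import Foundation

// Minimal sketch of the communication safety decision flow. The explicit-image
// flag would come from Apple's on-device machine learning model, which is not
// modeled here.
struct ChildAccount {
    let age: Int
    let parentNotificationsEnabled: Bool   // the feature is opt-in per iCloud Family
}

enum IncomingImageAction {
    case showNormally
    case blurWithWarning(notifyParentIfViewed: Bool)
}

func handleIncomingImage(isSexuallyExplicit: Bool, for child: ChildAccount) -> IncomingImageAction {
    guard isSexuallyExplicit else { return .showNormally }
    // Parents are only notified for children under 13, and only if the
    // family opted in to the feature.
    let notifyParent = child.age < 13 && child.parentNotificationsEnabled
    return .blurWithWarning(notifyParentIfViewed: notifyParent)
}
```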


Siri and Search


The last update Apple announced was an expansion to guidance in Siri and Search. Specifically, the services will provide additional resources to help both children and parents stay safe and get help when needed.

All of these features will arrive later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS 12.
