Apple Answers Some Important Privacy-Related Questions in New CSAM FAQ


Last week, Apple announced controversial child safety features for iOS, iPadOS, and macOS. Despite the good intent, the CSAM features have received a lot of backlash raising questions about privacy. Today, the Cupertino-based behemoth published a detailed FAQ answering some of the important questions regarding CSAM and user privacy.

Apple posted a six-page FAQ after the features drew significant backlash. Privacy advocates, including Edward Snowden, have slammed the feature, saying it will turn the iPhone into an “iNarc.”


The FAQ begins with Apple reassuring users that the feature will not be used by any government for any other purpose.

“Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it.”

Apple then highlights that the two features — communication safety in Messages and iCloud photo scanning — are entirely separate.

What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not affect users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.

When asked about governments forcing Apple to detect other content via the CSAM features, Apple answers:

Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not acquiesce to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
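The process the FAQ describes — comparing photos against a fixed list of known image hashes and only flagging an account for human review past a match threshold — can be sketched roughly as follows. This is an illustrative simplification: Apple's real system uses NeuralHash, a perceptual hash that survives resizing and re-encoding, whereas the cryptographic hash below only matches byte-identical files, and the threshold value here is an assumed parameter, not Apple's actual number.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Simplification: SHA-256 stands in for Apple's perceptual NeuralHash.
    # A cryptographic hash only matches byte-identical files; a perceptual
    # hash is designed to match visually identical images.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(photos: list[bytes], known_hashes: set[str],
                    threshold: int = 30) -> bool:
    # Per the FAQ, a single match does nothing on its own; only once the
    # number of matches against the known-CSAM hash list crosses a
    # threshold is the account flagged for human review before any
    # report to NCMEC. The threshold of 30 is an assumption for this sketch.
    matches = sum(1 for photo in photos if fingerprint(photo) in known_hashes)
    return matches >= threshold
```

The key privacy claim in the FAQ maps to the structure above: the system can only answer "does this photo match a known hash?", so images outside the fixed hash list contribute nothing, and a non-matching library never reaches human review.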

The rest of the FAQ goes into specifics of the three features, namely communication safety in Messages, CSAM detection, and security for CSAM detection in iCloud Photos. You can read the full FAQ here.

With the FAQ, Apple is aiming to resolve all the concerns regarding privacy and CSAM. But do you feel convinced? How do you feel about Apple searching the iCloud Photo Library for CSAM? Do you think this is a breach of your privacy? Drop a comment and let us know your thoughts!
