Apple Publishes CSAM FAQ to Deal With Questions Surrounding Its Botched Launch
Apple has published a new FAQ designed to answer some of the questions people have raised since the company announced its plans to detect CSAM in images uploaded to iCloud Photos.
“Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions,” the FAQ notes, before going on to say that “this document serves to address these questions and provide more clarity and transparency in the process.”
The full document is well worth a read, and we won’t reproduce it in full here, but it does offer important answers to some of the questions people have raised over the weekend. Chief among them is whether the CSAM detection system could be used to scan for other kinds of content in the future.
Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.
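To make the mechanics of that answer a little more concrete, here is a minimal, purely illustrative sketch in Swift of the kind of flow Apple describes: photo hashes are compared against a provider-supplied set of known CSAM hashes, nothing is reported automatically, and human review is only triggered once matches cross a threshold. This is not Apple’s actual implementation (the real system uses NeuralHash, private set intersection, and threshold secret sharing, none of which appear here), and every name and value below is hypothetical.

// Illustrative sketch only; not Apple's on-device matching code.
import Foundation

struct MatchResult {
    let matchedHashes: [String]
    let needsHumanReview: Bool
}

/// Compares hashes of photos queued for iCloud upload against a set of
/// known CSAM hashes supplied by child safety organizations. A match never
/// triggers an automatic report; it only flags the account for human
/// review once the number of matches crosses a threshold.
func checkUploads(photoHashes: [String],
                  knownCSAMHashes: Set<String>,
                  reviewThreshold: Int = 30) -> MatchResult {
    let matches = photoHashes.filter { knownCSAMHashes.contains($0) }
    return MatchResult(matchedHashes: matches,
                       needsHumanReview: matches.count >= reviewThreshold)
}

// Example: none of the uploaded photo hashes appear in the known-hash set,
// so nothing is flagged for review.
let result = checkUploads(photoHashes: ["a1b2", "c3d4"],
                          knownCSAMHashes: ["ffff", "eeee"])
print(result.needsHumanReview) // false

The key point the FAQ is making is visible even in this toy version: the only thing the check can ever match against is the list of known hashes it is given, which is why the question of who controls that list matters so much.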
Apple also noted that it will “refuse” any government demands to add non-CSAM images to the hash list of content checked within iCloud Photos.