Apple has confirmed that its upcoming CSAM image scanning feature won’t work on devices that don’t have iCloud Photos enabled. In other words, devices that don’t upload their photos to iCloud won’t have those photos scanned at all.

Apple caused quite the controversy yesterday when it confirmed that future software updates will see iPhones, iPads, and Macs check photos to ensure that they don’t feature child sexual abuse material (CSAM).

But Apple has confirmed to MacRumors that the feature will only run when iCloud Photos is enabled.

The scanning itself is not an optional feature and happens automatically, but Apple says it cannot detect known CSAM images if the ‌iCloud Photos‌ feature is turned off.

Apple has also confirmed that it cannot delve into iCloud backups to check for CSAM images, meaning the only time it will look at your photos is when a device is signed into iCloud with iCloud Photos enabled.
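In rough terms, the behavior Apple describes amounts to a hash-match against a database of known images, gated entirely on the iCloud Photos setting. The sketch below is a loose illustration of that gating logic only: the function names and the use of SHA-256 are assumptions for the example, as Apple’s actual system uses a perceptual “NeuralHash” and on-device cryptographic vouchers, not a plain cryptographic hash.

```python
import hashlib

# Hypothetical stand-in for the database of hashes of known CSAM images.
# The value below is a placeholder, not a real entry.
KNOWN_IMAGE_HASHES = {"deadbeef" * 8}

def scan_photo(photo_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    """Return True if the photo matches a known-image hash.

    Mirrors the confirmed behavior: with iCloud Photos disabled,
    no matching takes place at all.
    """
    if not icloud_photos_enabled:
        return False  # feature is inert without iCloud Photos
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES
```

The key point the example captures is that the check short-circuits before any hashing happens when iCloud Photos is off, which is why photos kept purely on-device are never examined.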

Whether people with such images on their devices will simply avoid enabling iCloud Photos isn’t clear, but we can only hope that they don’t know about this limitation and are brought to justice as a result.
