Apple Is Adding New Safety Features That Will Scan Your iMessage And Photo Library For Child Abuse Images

Apple has announced changes that will bring new child safety measures to iPhones, iPads, and Macs later this year. The features will only be available in the United States at launch but will be expanded internationally over time.

The first change is a new feature in the Messages app called Communication Safety.

Apple says it will warn children and their parents when sexually explicit photos are received or sent. If an image is determined to be explicit, it will automatically be blurred and a warning will be triggered.

When a child attempts to view an image that has been flagged by Communication Safety, they’ll receive a message telling them they are about to view images that could be hurtful. Parents can also opt to receive an alert when this happens.
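
To make the flow concrete, here is a rough sketch, in Swift, of how such a check could be wired up on-device. Apple has not published the actual API, so the explicit-content classifier and the parent-notification hook below are hypothetical stand-ins used purely for illustration.

    import Foundation

    struct IncomingImage {
        let data: Data
        var isBlurred = false
    }

    enum ViewDecision {
        case shown               // image is displayed normally
        case blurredWithWarning  // child sees a warning and must confirm before viewing
    }

    // Hypothetical placeholder for the on-device explicit-content check;
    // the real model belongs to Apple and has not been published.
    func isSexuallyExplicit(_ image: IncomingImage) -> Bool {
        return false
    }

    // Screens a photo being sent or received: explicit images are blurred and
    // gated behind a warning instead of being shown directly.
    func screen(_ image: inout IncomingImage) -> ViewDecision {
        guard isSexuallyExplicit(image) else { return .shown }
        image.isBlurred = true
        return .blurredWithWarning
    }

    // Called only if the child taps through the warning and views the image anyway.
    func childChoseToView(parentalAlertsEnabled: Bool, notifyParent: () -> Void) {
        if parentalAlertsEnabled {
            notifyParent()  // parents can opt in to be alerted at this point
        }
    }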

The new Communication Safety feature will be part of the upcoming iOS 15, iPadOS 15, and macOS Monterey updates, all of which are expected to arrive in or around September.

Apple will also use on-device detection to determine whether someone’s Photos library contains images that have been flagged as child sexual abuse material. Known Child Sexual Abuse Material (CSAM) images will be detected before they are uploaded to iCloud Photos, with Apple checking image hashes against a list of known hashes on-device. Each time a match is found, a marker is placed against the account. Once an unknown threshold is breached (essentially, once a certain number of matching images has been found), Apple will manually review the images and report the account to the authorities.
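
To illustrate the threshold idea, here is a simplified Swift sketch. Apple’s real system uses its NeuralHash perceptual hash and a private set intersection protocol rather than a plain SHA-256 set lookup, and the threshold is left as a parameter because Apple has not disclosed the actual number; everything below is an assumption-laden approximation, not Apple’s implementation.

    import Foundation
    import CryptoKit

    // Hex-encoded SHA-256 digest, standing in here for Apple's perceptual NeuralHash.
    func imageHash(_ data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    // Counts how many photos in the library match the on-device list of known hashes.
    func matchCount(library: [Data], knownHashes: Set<String>) -> Int {
        library.filter { knownHashes.contains(imageHash($0)) }.count
    }

    // The account is only escalated for manual review once the number of matches
    // crosses the (undisclosed) threshold.
    func shouldEscalateForReview(library: [Data],
                                 knownHashes: Set<String>,
                                 threshold: Int) -> Bool {
        matchCount(library: library, knownHashes: knownHashes) >= threshold
    }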

Apple says this method is better than others that handle image checking on a company’s servers, mainly because it offers increased privacy. Notably, any device with iCloud Photos disabled won’t have this checking applied. Apple sums up the benefits of its approach like so:

• This system is an effective way to identify known CSAM stored in iCloud Photos accounts while protecting user privacy.
• As part of the process, users also can’t learn anything about the set of known CSAM images that is used for matching. This protects the contents of the database from malicious use.
• The system is very accurate, with an extremely low error rate of less than one in one trillion accounts per year.
• The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.

In addition, Siri is getting smarter about CSAM and can be asked how to report content and where to find help for children who need it.
