Apple has announced that it is stepping up its efforts to improve child safety. Starting with iOS 15, watchOS 8, and macOS Monterey, the company will scan Messages, Siri requests, searches, and photos for content related to child sexual abuse.
The company has published a post explaining what to expect and how everything will work. The new features aim to educate and warn both children and parents about sexual abuse, and to limit the spread of Child Sexual Abuse Material (CSAM) online. They cover three areas: messaging, image storage, and searches.
Messages
From now on, Apple will scan the messages your child exchanges. If they receive a sexually explicit photo, it will be blurred, and the child will get a warning along with helpful resources and reassurance that it's okay not to open the image. If they open it anyway, their parents will be notified, and Apple will warn the child about that beforehand. Similar protection works in the other direction, notifying the parents if the child sends a sexually explicit photo.
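Roughly, the decision flow looks like this. This is a toy sketch, not Apple's actual code; every name in it is made up for illustration, and the real analysis happens on the device inside Messages.

```swift
// A toy sketch of the Messages flow described above, not Apple's code.
// All type and function names are hypothetical.

enum IncomingPhotoAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

/// Hypothetical helper: decide what happens when a child receives a photo.
func handleIncomingPhoto(isSexuallyExplicit: Bool) -> IncomingPhotoAction {
    guard isSexuallyExplicit else { return .showNormally }
    // The photo is blurred and the child is warned; the parents are only
    // notified if the child chooses to view it anyway.
    return .blurAndWarn(notifyParentsIfViewed: true)
}
```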

Image credits: Apple
You may feel a little freaked out about Apple reading your messages. However, the company assures us that the feature is designed so that it doesn't get access to any of your other messages.
Images
Another feature is aimed at preventing the spread of CSAM online. It allows Apple to detect known CSAM images stored in iCloud Photos. If such material is found, Apple will report it to the National Center for Missing and Exploited Children (NCMEC).
Once again, Apple says that the feature is “designed with user privacy in mind.”
“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
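Conceptually, that on-device matching boils down to comparing a photo's hash against a set of known hashes. Here's a minimal sketch of the idea in Swift. It is not Apple's implementation: the real system uses a perceptual image hash and additional cryptography, while this sketch uses a plain SHA-256 digest and a simple set lookup purely to illustrate the matching step.

```swift
import Foundation
import CryptoKit

// A minimal conceptual sketch, not Apple's implementation. The real system's
// perceptual hash survives resizing and re-encoding, which SHA-256 does not.

/// Hypothetical stand-in for the hash database shipped to the device.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b" // example value
]

/// Hash a photo's bytes (SHA-256 used here only as a stand-in).
func photoHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// True if the photo's hash appears in the known-CSAM database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(photoHash(imageData))
}
```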
Apple also uses a technology called threshold secret sharing. It ensures that Apple can only interpret the contents of the safety vouchers (the encrypted match records a device uploads alongside photos) once an iCloud Photos account crosses a threshold of known CSAM content. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account," Apple claims.
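The gist of threshold secret sharing: each matched photo's voucher carries a piece (a "share") of a decryption key, and until enough shares accumulate, none of the vouchers can be read. Below is a toy illustration, not Apple's protocol; all names are hypothetical and the threshold value is a placeholder, not Apple's actual figure.

```swift
import Foundation

// A toy illustration of the threshold idea, not Apple's protocol. In the real
// system the math of secret sharing means the decryption key (and thus the
// vouchers' contents) cannot be reconstructed until more than the threshold
// number of shares exist. All names and values below are hypothetical.

struct SafetyVoucher {
    let encryptedPayload: Data  // match details, unreadable below the threshold
    let keyShare: Data          // one share of the account's decryption key
}

let matchThreshold = 30  // placeholder value, not Apple's actual figure

/// Can the vouchers for this account even be decrypted for review?
func accountCrossesThreshold(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.count > matchThreshold
}
```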
Still, if your photo gets incorrectly flagged, you will be able to appeal Apple’s decision. You know, in case you share photos of onions and they get mistaken for boobs.
Siri and searches
Finally, Apple is also expanding guidance in Siri and Search in two ways. First, when someone asks Siri or searches for ways to report CSAM or child exploitation, the results will point them to resources for where and how to file a report. Second, if someone searches for queries related to CSAM, Siri and Search will intervene: the user will be informed that the topic is harmful and problematic, and will be pointed to resources for getting help.

Image credits: Apple
[via Engadget]