Apple to scan iPhones and iCloud for photos of child sexual abuse
Aug 6, 2021

Apple has announced that it’s about to increase its efforts to improve child safety. Starting with iOS 15, watchOS 8, and macOS Monterey, the company will scan your messages, Siri requests, searches, and photos for content that can be connected with child sexual abuse.
The company has published a post explaining what you can expect and how everything will work. The new features are aimed at educating and warning both children and parents about sexual abuse. There’s also an effort to limit the spread of Child Sexual Abuse Material (CSAM) online. Three areas will be covered: messaging, image storage, and searches.
Messages
From now on, Apple will scan messages your child exchanges with someone. If they receive a sexually explicit photo, it will be blurred. The child will get a warning along with helpful resources and reassurance that it’s okay not to open the image. And if they do open it, their parents will get a notification, which is something Apple will also warn the child about. Similar protection works the other way around too, warning the parents if the child sends a sexually explicit photo.

You may feel a little freaked out about Apple reading your messages. However, the company assures us that the feature is designed so that it doesn’t have access to any messages other than these.
Images
Another feature is designed to prevent the spread of CSAM online. It allows Apple to scan for and detect known CSAM images stored in iCloud Photos. If such material is found, Apple will report it to the National Center for Missing and Exploited Children (NCMEC).
Once again, Apple says that the feature is “designed with user privacy in mind.”
“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
Apple also uses a technology called threshold secret sharing. It ensures that Apple can only interpret the safety vouchers (the encrypted match results uploaded alongside photos) if the iCloud Photos account crosses a threshold of known CSAM content. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” Apple claims.
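To make the flow above concrete, here’s a minimal sketch of the matching-plus-threshold logic Apple describes. This is purely illustrative: the real system uses a perceptual hash (Apple’s NeuralHash) rather than a cryptographic one, blinds the database so the device can’t read it, and decrypts vouchers with threshold secret sharing rather than a simple counter. The function names, the SHA-256 stand-in, and the threshold value here are all assumptions for the sake of the example.

```python
import hashlib

# Stand-in for the database of known CSAM image hashes provided by NCMEC.
# (Real system: a blinded set of NeuralHash values, unreadable on-device.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Hypothetical threshold: matches required before anything becomes readable.
THRESHOLD = 2

def count_matches(photos):
    """Count how many photos hash into the known-image set."""
    return sum(hashlib.sha256(p).hexdigest() in KNOWN_HASHES for p in photos)

def account_flagged(photos):
    # Below the threshold, individual match results stay uninterpretable;
    # only crossing it would let the vouchers be decrypted and reviewed.
    return count_matches(photos) >= THRESHOLD

library = [b"vacation", b"known-image-1", b"cat"]
print(account_flagged(library))  # one match, below the threshold
```

The point of the threshold design is visible even in this toy version: a single accidental match reveals nothing about the account, which is how Apple justifies its one-in-one-trillion false-flag claim.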
Still, if your photo gets incorrectly flagged, you will be able to appeal Apple’s decision. You know, in case you share photos of onions and they get mistaken for boobs.
Siri and searches
Finally, Apple is also expanding Siri and Search in two ways. First, when someone asks Siri or searches for ways to report CSAM or child exploitation, the results will point them to resources on where and how to file a report. And second, if someone searches for queries related to CSAM, Siri and Search will intervene. The user will be informed that this topic is harmful and problematic and will get resources for finding help.

[via Engadget]
Dunja Đuđić
Dunja Djudjic is a multi-talented artist based in Novi Sad, Serbia. With 15 years of experience as a photographer, she specializes in capturing the beauty of nature, travel, concerts, and fine art. In addition to her photography, Dunja also expresses her creativity through writing, embroidery, and jewelry making.
Join the Discussion
DIYP Comment Policy
Be nice, be on-topic, no personal information or flames.
7 responses to “Apple to scan iPhones and iCloud for photos of child sexual abuse”
how about they add racial abuse to that scanning as well?
Nope. Look I get that if anything they will work in conjunction with the FBI and use they disgustingly, plethora deep catalog of child abuse archive to see if that’s already on some ones phone. But we already know how much of a pain it is to get a real human to fix what their AI will “accidentally” flag. Bottom line, there’s no telling where that scanning stops.
So Apple will scan all your pictures and compare to a database to see if it’s fits the “child abuse” criteria…
A feature is “designed with user privacy in mind.” but done by a subcontractor that will be easy to blame in case scan is use for other purposes.
Why I’m not convinced?
The road to hell is paved with good intentions.
It is time we all leave these Social Justice WOKE corporations that do not understand that it is our data and they should stop scanning thru it by constantly redefining their privacy terms. It is not their job to do policing. As far as I know, apple’s mission is to squeeze as much money as they can from their users while providing a lame service to their customers.
This is borderline fascism.
prepare yer buttholes for more privacy invasion peoples. Once govt gets it’s hands on this tech it’s all over boiz
Since our (USA) justice system has a swinging door policy with child abusers I am good for any and all processes that can put a stop to victimizing children.