Apple to scan iPhones and iCloud for photos of child sexual abuse

Aug 6, 2021

Dunja Djudjic

Dunja Djudjic is a multi-talented artist based in Novi Sad, Serbia. With 15 years of experience as a photographer, she specializes in capturing the beauty of nature, travel, and fine art. In addition to her photography, Dunja also expresses her creativity through writing, embroidery, and jewelry making.

Apple has announced that it’s stepping up its efforts to improve child safety. Starting with iOS 15, watchOS 8, and macOS Monterey, the company will scan your messages, Siri requests, searches, and photos for content that can be connected to child sexual abuse.

The company has published a post explaining what you can expect and how everything will work. The new features are aimed at educating and warning both children and parents about sexual abuse, and at limiting the spread of Child Sexual Abuse Material (CSAM) online. Three areas will be covered: messaging, image storage, and searches.

Messages

From now on, Apple will scan the messages your child exchanges with others. If they receive a sexually explicit photo, it will be blurred, and the child will get a warning along with helpful resources and reassurance that it’s okay not to open the image. If they open it anyway, their parents will get a notification, which is something Apple will also warn the child about. Similar protection works in the other direction, warning the parents if the child sends a sexually explicit photo.

Image credits: Apple
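
To make that flow concrete, here’s a minimal sketch of the logic as described above. This is not Apple’s code: the isSexuallyExplicit flag stands in for Apple’s on-device classifier (which isn’t public), and the UI hooks are hypothetical placeholders.

```swift
import Foundation

// Hypothetical stand-in for a received photo plus the verdict of an on-device classifier.
struct IncomingPhoto {
    let data: Data
    let isSexuallyExplicit: Bool
}

enum ChildAction { case doNotOpen, openAnyway }

// Placeholder UI hooks for this sketch.
func display(_ photo: IncomingPhoto) { /* show the photo normally */ }
func displayBlurred(_ photo: IncomingPhoto) { /* show a blurred preview */ }
func showWarningAndResources() { /* “it’s okay not to open this”, plus helpful links */ }

func handle(_ photo: IncomingPhoto, childChoice: ChildAction, notifyParents: () -> Void) {
    guard photo.isSexuallyExplicit else {
        display(photo)            // ordinary photos are shown as usual
        return
    }
    displayBlurred(photo)         // explicit photos arrive blurred
    showWarningAndResources()     // warning, reassurance, and helpful resources
    if childChoice == .openAnyway {
        display(photo)
        notifyParents()           // the child is warned in advance that this will happen
    }
}
```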

You may feel a little freaked out about Apple reading your messages. However, the company assures us that the feature is designed so that it doesn’t get access to any messages other than these.

Images

Another feature is designed to prevent the spread of CSAM online. It allows Apple to scan for and detect known CSAM images stored in iCloud Photos. If such material is found, Apple will report it to the National Center for Missing and Exploited Children (NCMEC).

Once again, Apple says that the feature is “designed with user privacy in mind.”

“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
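
As a rough illustration of what on-device matching against a database of known hashes means, here’s a minimal sketch. The hash values and the perceptualHash function are hypothetical placeholders; Apple’s real system uses an unreadable (blinded) hash database and cryptographic safety vouchers rather than a plain set lookup.

```swift
import Foundation

// Hypothetical stand-in for a perceptual image hash; Apple’s actual hashing
// and blinding scheme is more involved and isn’t reproduced here.
func perceptualHash(of imageData: Data) -> String {
    String(imageData.hashValue)
}

struct KnownHashDatabase {
    // Known image hashes provided by child safety organizations.
    // In the real system, this set is stored on the device in an unreadable form.
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    // On-device check: does this photo match a known hash?
    func matches(_ imageData: Data) -> Bool {
        knownHashes.contains(perceptualHash(of: imageData))
    }
}

// Usage with made-up values.
let database = KnownHashDatabase(knownHashes: ["known-hash-1", "known-hash-2"])
let photo = Data([0x01, 0x02, 0x03])
print(database.matches(photo))  // false for this arbitrary photo
```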

Match results aren’t reported directly; each one is wrapped in an encrypted safety voucher that’s uploaded along with the image. Apple also uses a technology called threshold secret sharing, which ensures that Apple can only interpret the contents of the safety vouchers once an iCloud Photos account crosses a threshold of known CSAM content. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” Apple claims.
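
Threshold secret sharing is a general cryptographic technique, so here’s a toy Shamir-style sketch of the idea: a secret (which in Apple’s design would protect the vouchers’ contents) can only be reconstructed once at least a threshold number of shares is available, while fewer shares reveal nothing. Everything below is illustrative; the numbers and parameters are made up, and this is not Apple’s implementation.

```swift
import Foundation

let p = 2_147_483_647  // prime modulus (2^31 - 1) for a toy finite field

func mod(_ a: Int, _ m: Int) -> Int { ((a % m) + m) % m }

// Modular exponentiation; also used for modular inverses via Fermat's little theorem.
func powMod(_ base: Int, _ exponent: Int, _ m: Int) -> Int {
    var result = 1
    var b = mod(base, m)
    var e = exponent
    while e > 0 {
        if e & 1 == 1 { result = mod(result * b, m) }
        b = mod(b * b, m)
        e >>= 1
    }
    return result
}

func inverse(_ a: Int, _ m: Int) -> Int { powMod(a, m - 2, m) }

// Split `secret` into `n` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, n: Int) -> [(x: Int, y: Int)] {
    let coefficients = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<p) }
    return (1...n).map { x in
        var y = 0
        for (power, c) in coefficients.enumerated() {
            y = mod(y + mod(c * powMod(x, power, p), p), p)
        }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from `threshold` shares.
// With fewer shares than the threshold, the secret stays hidden.
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var term = si.y
        for (j, sj) in shares.enumerated() where i != j {
            term = mod(term * mod(sj.x * inverse(mod(sj.x - si.x, p), p), p), p)
        }
        secret = mod(secret + term, p)
    }
    return secret
}

// Usage with made-up values: 5 "vouchers", readable only once 3 or more are available.
let shares = makeShares(secret: 424_242, threshold: 3, n: 5)
print(reconstruct(from: Array(shares.prefix(3))))              // prints 424242
print(reconstruct(from: Array(shares.prefix(2))) == 424_242)   // almost surely false
```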

Still, if your photo gets incorrectly flagged, you will be able to appeal Apple’s decision. You know, in case you share photos of onions and they get mistaken for boobs.

Siri and searches

Finally, Apple is also expanding Siri and Search in two ways. First, when someone asks Siri or searches for ways to report CSAM or child exploitation, the results will point them to resources explaining where and how to file a report. And second, if someone searches for queries related to CSAM, Siri and Search will intervene: the user will be informed that the topic is harmful and problematic and will be pointed to resources for getting help.

Image credits: Apple

[via Engadget]

7 responses to “Apple to scan iPhones and iCloud for photos of child sexual abuse”

  1. Graeme Simpson

    how about they add racial abuse to that scanning as well?

  2. Steven Naranjo

    Nope. Look, I get that if anything they will work in conjunction with the FBI and use their disgustingly deep catalog of child abuse archives to see if that’s already on someone’s phone. But we already know how much of a pain it is to get a real human to fix what their AI will “accidentally” flag. Bottom line, there’s no telling where that scanning stops.

  3. Camera operator Hong Kong

    So Apple will scan all your pictures and compare them to a database to see if they fit the “child abuse” criteria…
    A feature “designed with user privacy in mind,” but done by a subcontractor that will be easy to blame in case the scan is used for other purposes.

    Why am I not convinced?

  4. Olivier Gallen

    The road to hell is paved with good intentions.

  5. Fletch

    It is time we all leave these Social Justice WOKE corporations that do not understand that it is our data, and they should stop scanning through it by constantly redefining their privacy terms. It is not their job to do policing. As far as I know, Apple’s mission is to squeeze as much money as they can from their users while providing a lame service to their customers.
    This is borderline fascism.

  6. MegaNickels

    prepare yer buttholes for more privacy invasion peoples. Once govt gets its hands on this tech it’s all over boiz

  7. John Beatty

    Since our (USA) justice system has a swinging-door policy with child abusers, I am good with any and all processes that can put a stop to victimizing children.