Apple says its child abuse prevention system will only scan already flagged photos

Aug 16, 2021

Dunja Djudjic

Dunja Djudjic is a multi-talented artist based in Novi Sad, Serbia. With 15 years of experience as a photographer, she specializes in capturing the beauty of nature, travel, and fine art. In addition to her photography, Dunja also expresses her creativity through writing, embroidery, and jewelry making.


Apple has faced harsh criticism since announcing that it will scan iCloud and iPhones for photos of child sexual abuse. The company has now responded, clarifying that it will only scan photos that have already been flagged in multiple countries.

One of the features allows Apple to scan stored iCloud images in search of known CSAM (Child Sexual Abuse Material) images. If any of them are found, Apple will report it to the National Center for Missing and Exploited Children (NCMEC). According to Reuters, Apple said on Friday that it will “hunt only for pictures that have been flagged by clearinghouses in multiple countries.”

This feature uses an automatic scanning system to go through your photos. Once the number of flagged matches reaches a certain threshold, Apple gets an alert and a human reviewer is allowed to examine the content on your iCloud. Apple initially refused to say what the threshold is, but has now confirmed that it’s 30 images, adding that the number will eventually be reduced as the system improves. Also, the “surveillance,” as some have dubbed this feature, will be limited to the US for now.
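To make the threshold mechanism concrete, here is a deliberately simplified sketch. This is not Apple’s implementation: the real system uses NeuralHash with cryptographic protocols (private set intersection and threshold secret sharing) so that Apple learns nothing until the threshold is crossed. The hash database, function names, and plain set lookup below are all hypothetical stand-ins for illustration only.

```python
# Hypothetical sketch of threshold-based flagging, NOT Apple's actual system.
# Real CSAM detection uses perceptual hashes (NeuralHash) and cryptography;
# here we use plain string "hashes" and a simple set lookup.

KNOWN_FLAGGED_HASHES = {"a1b2", "c3d4", "e5f6"}  # placeholder database
REVIEW_THRESHOLD = 30  # the threshold Apple confirmed

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the flagged database."""
    return sum(1 for h in photo_hashes if h in KNOWN_FLAGGED_HASHES)

def should_alert(photo_hashes, threshold=REVIEW_THRESHOLD):
    """Human review is triggered only once matches reach the threshold."""
    return count_matches(photo_hashes) >= threshold
```

The point of the threshold design is that a single false-positive match never exposes an account; only an accumulation of 30 matches triggers an alert and human review.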

When asked whether the criticism had influenced these decisions, Apple declined to respond, saying only that “the project was still in development and changes were to be expected.”

[via Engadget]



One response to “Apple says its child abuse prevention system will only scan already flagged photos”

  1. Fletch

    Ah! That makes it better. NO! Stop scanning through my devices as if I was guilty of something. Let the police do the policing. What is next? A system on my phone that detects illegal conversations? Hmm,.. That’s not a bad idea