Apple tackles child sex abuse imagery: Slippery slope or necessary intervention?

Aug 19, 2021

Allen Murabayashi

We love it when our readers get in touch with us to share their stories. This article was contributed to DIYP by a member of our community. If you would like to contribute an article, please contact us here.

Apple recently announced a new set of features aimed at combating Child Sexual Abuse Material (CSAM), including the ability to scan a user’s phone and iMessages. Since the announcement, the company has reiterated the numerous safeguards it has developed, but privacy advocates have bemoaned the potential for abuse and “mission creep.”

The proliferation of CSAM has become an epidemic. The National Center for Missing and Exploited Children (NCMEC), the U.S. organization responsible for tracking CSAM, has reported an exponential increase in the number of images and videos it receives, from 600,000 a decade ago to over 70 million in 2019.

Although pundits have speculated as to Apple’s motivation for releasing the feature, it’s clear that Apple’s status as a market maker means that other electronic service providers (ESPs) will be forced to take note. In the U.S., there is no legal obligation for companies to scan for CSAM, and consequently, many don’t – including Dropbox, Google, Amazon, and others – even though it’s almost a certainty that their systems store the content.

Facebook has been one of the few high-profile ESPs to scan every image and video for CSAM, but as its former Chief Security Officer Alex Stamos explained in a recent Stanford Internet Observatory (SIO) video, the typical paradigm is to detect CSAM at the moment content is shared (whether through a messaging app or a shared album). Apple’s proposed system instead works locally on your phone when the feature is enabled, which raises the question of who actually owns your computing device if a tech company can start snooping on images and videos that are ostensibly private.
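
To make that paradigm concrete, here is a deliberately simplified sketch of hash-list matching, the basic technique these scanners are built on. It is not Apple’s actual system: Apple’s NeuralHash uses perceptual hashes and cryptographic “safety vouchers” rather than plain file digests, and the hash list, folder path, and reporting threshold below are hypothetical, for illustration only.

```python
# Illustrative sketch only. Real CSAM scanners (e.g. PhotoDNA, Apple's NeuralHash)
# use perceptual hashes that survive resizing and re-encoding; a plain SHA-256
# digest is used here just to show the shape of hash-list matching.
import hashlib
from pathlib import Path

# Hypothetical list of known-image digests (in practice supplied by NCMEC,
# and never visible in plaintext to the device owner).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

MATCH_THRESHOLD = 3  # hypothetical: only escalate after several matches


def digest(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Compare every JPEG in a folder against the known-hash list."""
    return [p for p in sorted(photo_dir.glob("*.jpg")) if digest(p) in KNOWN_HASHES]


if __name__ == "__main__":
    hits = scan_library(Path("~/Pictures").expanduser())
    # Only act once the count crosses the threshold, mirroring the idea of
    # tolerating isolated false positives before any human review happens.
    if len(hits) >= MATCH_THRESHOLD:
        print(f"{len(hits)} matches: flag for human review")
    else:
        print(f"{len(hits)} matches: below threshold, nothing reported")
```

The matching step itself is unremarkable; the controversy is about where it runs. When a server scans content you have already shared, the provider is inspecting data it hosts. When the same check runs on your own phone, it touches content you may never have intended to share at all.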

Apple has stated that the feature will roll out in the U.S. first, and will only be deployed in other countries after further examination. They have also unequivocally stated that the technology will only be used for CSAM, and not to satisfy another country’s demands (e.g. identifying “terrorists” or political dissidents).

For photographers, the potential breach of privacy is concerning. Photographers of every ilk – from photojournalists to landscape photographers – have legitimate reasons to ensure that content isn’t seen by anyone else (human or machine) until they choose to disseminate or publish their images. The notion of Canon, Sony, or Nikon running content scans on your camera is horrifying, and it is not an unfair analogy.

Stanford Internet Observatory research scholar Riana Pfefferkorn made the point in a recent round-table discussion that technology can’t fix the underlying sociological, historical, and poverty-fueled issues that create the conditions (particularly in Southeast Asia) in which child rape and abuse take place and are recorded.

That said, most experts agree that without a coordinated and concerted effort on the part of ESPs, the proliferation of CSAM will continue unabated. Apple’s solution might catalyze a more coherent industry response, or it might start us down a slippery slope in a terrible ethical conundrum with a tragic human toll.

About the Author

Allen Murabayashi is a graduate of Yale University, the Chairman and co-founder of PhotoShelter, and a co-host of the “Vision Slightly Blurred” podcast on iTunes. For more of his work, check out his website and follow him on Twitter. This article was also published here and shared with permission.

One response to “Apple tackles child sex abuse imagery: Slippery slope or necessary intervention?”

  1. Chris Lee

    This is more than a slippery slope. It’s a downhill race to serious erosion of a fundamental right: privacy. Private companies accessing your personal property is more than concerning.

    And given tech companies’ track record of earning our trust, this is a very bad situation not just for photographers, but for anyone who uses their products. From their behavior over the last 30 years, I simply do not trust anything tech companies do or say with regard to privacy, protection, or concern for users’ data.

    I made a whole video recently on this announcement on the Pal2tech channel.

    And I am a father with two kids under the age of 13; I have used a couple of cameras and own dozens of Apple products. Nobody wants these sorts of protections more than I do. But this should be investigated and approved, not just arbitrarily turned on.

    And this is the start of a very, very concerning trend.