Earlier this year, Apple announced that it would scan iPhone and iCloud photos for child sexual abuse material. A fierce backlash ensued, and now it looks like the company has quietly pulled the feature without telling anyone.
Apple first announced the feature in August 2021. It was designed to scan your child’s messages, as well as your images, Siri queries, and searches, all in an attempt to detect Child Sexual Abuse Material (CSAM) and stop it from spreading online.
As Apple explained back then, the feature wouldn’t store your photos anywhere. Instead, it would only compute image hashes (“digital fingerprints”) and compare them with a database of known CSAM hashes. However, this wasn’t enough to stop the backlash, which came from individuals and companies alike, including Meta (oh, the irony).
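Apple’s actual system uses a proprietary perceptual hash (NeuralHash) combined with private set intersection, so the sketch below is a deliberately simplified illustration rather than the real mechanism: it treats fingerprint matching as a plain set-membership check, with a cryptographic hash standing in for the perceptual one. All names and sample data here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash: here, just a SHA-256 digest.
    Apple's real NeuralHash tolerates resizing and re-encoding of an
    image; a cryptographic hash like this one does not."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Membership check against a database of known fingerprints.
    In Apple's design this comparison happens on-device via private
    set intersection, so non-matching values are never revealed."""
    return fingerprint(image_bytes) in known_hashes

# Hypothetical database containing one known fingerprint
known = {fingerprint(b"example known image")}
print(is_flagged(b"example known image", known))  # True
print(is_flagged(b"some other image", known))     # False
```

The key point the sketch captures is that only fingerprints are compared, never the photos themselves; a match reveals that an image is already in the database, while an unknown image produces a hash that matches nothing.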
Following the backlash, Apple said that the child abuse prevention system would only scan photos that had already been flagged. Then, in September, the company decided to postpone the feature. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said at the time. Now, however, it looks like the plans have changed again.
iOS 15.2 was released earlier this week, and the Communication Safety features rolled out for Messages. However, as MacRumors noticed, Apple decided to delay the rollout of CSAM detection. According to the same source, any mention of the feature has also disappeared from Apple’s website.
One might think that Apple has decided to ditch the feature completely, considering all the criticism and concerns. However, it appears that the launch has only been delayed further. An Apple spokesperson told The Verge that the plans haven’t changed after all, and the technical summary of CSAM detection is still available as a PDF. So, it looks like you are about to be under Apple’s surveillance – just not yet.