Apple announced last week that it would start scanning your iPhones and iCloud for photos of child sexual abuse. The new feature sparked a lot of controversy and concern among both users and other companies. When asked whether it would apply the same technology, WhatsApp said it wouldn’t, while at the same time calling Apple’s move “surveillance.”
Head of WhatsApp Will Cathcart tweeted that Apple’s move is “the wrong approach and a setback for people’s privacy all over the world.” When asked whether WhatsApp would adopt the same system, he replied with a short “no.”
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.
People have asked if we'll adopt this system for WhatsApp. The answer is no.
— Will Cathcart (@wcathcart) August 6, 2021
In a rather long Twitter thread, Cathcart writes that everyone wants to see abusers caught. However, he points out that there are different ways to go about it:
“We’ve worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it’s shared. We reported more than 400,000 cases to NCMEC last year from WhatsApp, all without breaking encryption.”
Cathcart adds that Apple’s approach “introduces something very concerning into the world.”
“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone,” Cathcart writes. “That’s not privacy.”
“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.”
There’s also a bit of Sinophobia in the WhatsApp head’s tweets, as he wonders whether Apple’s technology will be used in China and what kind of content would be considered illegal there. But I believe that, even in the overly politically correct Western society, this technology could be misused in a variety of ways as well.
Matthew Green of the Johns Hopkins Information Security Institute also expressed concerns about Apple’s new system, calling it “a really bad idea.” He warns that it could eventually become “a key ingredient in adding surveillance to encrypted messaging systems.”
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
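To illustrate what “client-side scanning” means in practice, here is a minimal conceptual sketch of matching local photos against a database of known image hashes. It is only a rough illustration under assumed names and values: the hash function, file paths, sample hash, and threshold below are all made up for the example, and Apple’s actual system uses a perceptual “NeuralHash,” private set intersection, and threshold secret sharing, none of which is reproduced here.

```python
# Conceptual sketch of client-side hash matching -- NOT Apple's actual
# NeuralHash / private-set-intersection implementation. The hash function,
# database contents, paths, and threshold are illustrative assumptions only.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse imagery, shipped to the device.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Hash a file's bytes. A real system would use a perceptual hash that
    survives resizing and re-encoding, not a plain cryptographic hash."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path, threshold: int = 3) -> bool:
    """Flag the library only when the number of matches reaches a threshold,
    loosely mirroring the thresholded reporting Apple described."""
    matches = sum(1 for p in photo_dir.glob("*.jpg")
                  if file_hash(p) in KNOWN_HASHES)
    return matches >= threshold

if __name__ == "__main__":
    if scan_library(Path.home() / "Pictures"):
        print("Threshold reached; matches would be flagged for human review.")
```

The point of the sketch is simply that the comparison happens on the device itself, against a hash list the user never sees, which is exactly the property Cathcart and Green object to.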
WhatsApp has been a part of Facebook since 2014, which I find pretty ironic considering that Facebook hasn’t exactly been known for taking care of its users’ privacy. Only instead of scanning their photos for child abuse, Facebook has exposed the passwords of millions of Instagram and Facebook users, multiple times. Whoopsie.
On the one hand, we do need a variety of methods for tracking down child abusers and those who distribute photos and videos of such content. On the other hand, this technology does sound concerning, and there are plenty of ways it could be misused. In addition, there’s so much going on “under the surface” in the realms of the dark web that whatever is found through iCloud and iPhone scanning is just the tip of the iceberg.
[via Gizmodo]