Instagram’s algorithms are actively promoting child sex abuse rings, researchers have reported. Networks of paedophiles are using the platform to commission and share illegal content, as well as to groom new victims.
The Wall Street Journal and academics at Stanford University and the University of Massachusetts Amherst conducted a joint investigation into the issue. The results have revealed a dark underside to the popular social media network run by Meta.
The researchers discovered that hashtags were being used openly to promote and share material. Thinly veiled public feeds act as menus of content, which is then sold privately. The content includes explicit videos and images of underage teens, self-harm, and even bestiality. Much of it is self-generated by the users themselves, the report notes.
Once the researchers started following these hashtags and accounts, Instagram’s recommendation algorithms immediately surfaced more related accounts and content. The app has previously been reported to push ever more harmful content into the paths of already vulnerable users; this time, however, the same mechanism has taken on an even more sinister role.
Since the report came out, Meta has responded, saying it has set up a task force to investigate and prevent this from happening further. “We’re continuously exploring ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them,” said a spokesperson from Meta.
The EU has warned Meta that it will impose serious sanctions on the social media parent company if it fails to fix these child protection issues.
#Meta’s voluntary code on child protection seems not to work.
Mark Zuckerberg must now explain & take immediate action.
I will discuss with him at Meta’s HQ in Menlo Park on 23 June.
After 25 August, under #DSA Meta has to demonstrate measures to us or face heavy sanctions. pic.twitter.com/jA25IJH8Dp
— Thierry Breton (@ThierryBreton) June 8, 2023
However, this is not an isolated issue. The Guardian reported in April that Facebook and Instagram were being used for child sex trafficking and that Meta was doing nothing to stop it. Additionally, when content was reported, it somehow slipped past the moderation process and was given a green light. Instagram, it seems, is more concerned with censoring adult female nipples than protecting children.
Alex Stamos, head of Stanford’s Internet Observatory and former chief security officer for Meta, told the WSJ that the company can and should be doing more to tackle this issue. “That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” said Stamos. “I hope the company reinvests in human investigators.”
If you didn’t already have a reason to jump ship from Instagram, perhaps this is it.
[Via The Verge]