Snap, Google and Apple sued for failing to protect kids from online predators

May 19, 2022

Dunja Djudjic

Dunja Djudjic is a multi-talented artist based in Novi Sad, Serbia. With 15 years of experience as a photographer, she specializes in capturing the beauty of nature, travel, and fine art. In addition to her photography, Dunja also expresses her creativity through writing, embroidery, and jewelry making.


Yet another lawsuit has arisen against Snap, and this time, Google and Apple have been sued, too. A 16-year-old girl and her mother sued the three companies after a grown man, an active-duty Marine, had manipulated the girl into sending him nude photos since she was 12. The lawsuit claims that the companies failed to protect teen users from “egregious harm” and the spread of Child Sexual Abuse Materials (CSAM).

The lawsuit against Snapchat

As Insider reports, a class-action lawsuit was filed in a California federal court this week. The girl and the man were identified by initials: she is identified as L.W., whereas the predator is identified as B.P. According to The Washington Post which first reported on the case, B.P. first contacted L.W. on Instagram in 2018, just before her 13th birthday. After that, he asked for her Snapchat account.

Speaking to The Post, the teen said:

“Every girl has insecurities. With me, he fed on those insecurities to boost me up, which built a connection between us. Then he used that connection to pull strings.”

The man reportedly started asking for photos of L.W. in her underwear, then pressured and manipulated her to send nude videos. He also sent her explicit videos of himself. When she would refuse to get more explicit, she says that he would berate her until she complied.

According to the lawsuit, L.W. blocked B.P. several times, but he kept contacting her through Instagram or via fake Snapchat accounts until she started talking to him again. This went on for nearly three years, during which the two reportedly exchanged hundreds of photos.

L.W. told The Post that she felt ashamed, but she was too afraid to tell her parents what was going on. Other than fearing their judgment, she was also worried about what he might do if she stopped sending him photos. “I thought this would be a secret,” L.W. told The Post. “That I would just keep this to myself forever.”

However, this wasn’t the case. B.P. used an app called Chitter to distribute the victim’s photos. This is why Apple and Google were sued as well, but more on that later. Eventually, the girl gathered the courage to tell her mom what was going on.

The lawsuit against Apple and Google

The role of Apple and Google in this case is that they were hosting Chitter in their stores. According to the lawsuit, B.P. used this app to trade videos of the victim with other users. What’s more, he reportedly pressured other underage girls into sending him sexually explicit videos, which he also shared with other creeps.

According to The Post, both Apple and Google said they had removed the app on Wednesday following questions from the newspaper. Chitter’s developers reportedly never responded to requests for comment.

As L.W.’s lawyers note, the girl has suffered major physical and mental harm. “Due to the physical and psychological harms, L.W. was assessed at a teen suicide outpatient program, and even an emergency room after a suicide attempt,” the lawyers said. “She sought care from a personal therapist, psychiatrist, and was prescribed antidepressants.”

“The claims alleged in this case are not against the adult perpetrator,” the lawyers write. “They are against three major technology companies who enable him and others to commit these crimes.” The mentioned adult perpetrator was already convicted last year of charges related to child pornography and sexual abuse.

The lawsuit further argues that Snap takes a “reactive approach to protect teens from abuse.” In other words, it requires children to report their own abuse after it has occurred. As you can imagine, this is a huge burden for young minds, and because of it, cases like this often go unreported on platforms like Snapchat and Instagram.

Two other lawsuits were filed against Snapchat this year (alongside Meta’s Facebook and Instagram). They were not related to sexual abuse, but to harm to teenagers’ mental health, following the suicides of two children.

[via PetaPixel]
