Instagram’s algorithm to blame for teen daughter’s serious eating disorder says lawsuit

Jun 17, 2022

Alex Baker

Alex Baker is a portrait and lifestyle driven photographer based in Valencia, Spain. She works on a range of projects from commercial to fine art and has had work featured in publications such as The Daily Mail, Conde Nast Traveller and El Mundo, and has exhibited work across Europe

The parents of a young woman suffering from a severe eating disorder are blaming Instagram in a lawsuit. Rather than targeting the content itself, however, they are aiming their accusations at the platform’s algorithms in an attempt to get around the protections of Section 230.

The lawsuit filed in California federal court alleges Instagram’s parent company Meta purposely crafted products to addict young users, steering one 11-year-old girl down a years-long path of physical and psychological harm.

The case, brought on behalf of now-19-year-old Alexis Spence, asserts that Instagram “consistently and knowingly” targeted its product at young children while ignoring internal warnings about its damaging effects on users’ mental health.

The parents argue that, due to Alexis’ prolonged addiction to Instagram, she has required extensive counselling, outpatient consultations, overnight hospital stays and eating disorder programs, and will likely need a service dog and continual mental health monitoring for the rest of her life.

The lawsuit directly cites the internal Facebook documents leaked in 2021, which included confidential reports and presentations portraying Instagram as a blight on adolescent mental health.

The suit is also the most recent attempt to find a way around the liability shield granted to website owners and operators under Section 230 of the Communications Decency Act, passed in 1996. It effectively means that the owners of a website where third parties can post content are not liable for the effects of that content.

The online world is an entirely different landscape now than it was when that law was passed, and one could argue that the huge tech and social media companies have benefitted from that bulletproof protection at considerable cost not just to their users but to their moderators as well.

“There’s a concerted effort across the country to re-frame lawsuits against internet services as attacking their software tools, not attacking the content that’s published using them,” Eric Goldman, a law professor at Santa Clara University, told Gizmodo.

This is, of course, not the first lawsuit brought against Meta since the report was leaked. Sad stories of teens taking their own lives have filled the news, with distraught parents lashing out at the social media giant and begging it to take more responsibility for its younger users. The harmful messaging across social media has even become the target of awareness campaigns designed to counter it.

If successful, it could become a landmark case because the lawyers are targeting Instagram’s algorithm rather than the specific content consumed. It has been reported for some time that the algorithm keeps serving viewers more of the same content and is deliberately designed to be psychologically addictive.

“Again, you’re talking about the algorithm and the way that the complaint may be framed is really more about the overall service, that everything about the service was designed to encourage usage and that encouraged amount of usage is what caused the problem,” Goldman says. “It doesn’t mean they’ll win, but they may have found a way to get around Section 230.”

As parents, it’s difficult to always keep an eye on what your teens are watching, and far more so now that so many of them have their own devices with countless channels available. Social media is still new enough that, for some, it can be difficult to grasp how seriously it can affect young, pliable minds.

I was 26 years old when Facebook started, already an adult. I cannot fathom growing up in today’s online environment, or the distinct challenges it poses. The mind of a 10-year-old, or even a 19-year-old, is still developing and far more susceptible to manipulation than an adult brain. Parents cannot monitor their children’s every activity 24/7, and it is time the big tech companies stepped up and were held accountable as well.

3 responses to “Instagram’s algorithm to blame for teen daughter’s serious eating disorder says lawsuit”

  1. Chuck Diesel

    Ummm…. What about the responsibility of the parents? They bought her the phone, knowing what she could be exposed to when they gave it to her. It would be the same as blaming a car company for teens that street race. Or another company for building a product that someone uses to commit a crime with.

    1. Sean

      Love strawman arguments. Obviously you have NEVER dealt with an addict. Same principles and issues apply regardless of the addiction. Addiction is a physical AND mental reliance on something. Opium or Instagram, addiction is the same. Section 230 was primarily meant to protect forum sites where users post content, but the site does not promote the content, only provides a platform. FB and Insta no longer fit that mold. They do not just let users post content that is seen by those who choose to, they actually publish it by creating algorithms that push that content out to users based on whatever criteria they determine. That makes them a publisher and not protected by 230 if you follow the purpose it was created to address. Car companies don’t randomly make you race based on some demographic, they sell you a product, not a service. As for your second example, guns come to mind. You won’t get addicted to buying cars or guns (well, not in the same sense).

      1. Chuck Diesel

        I have dealt with addiction before, but I learned from other people’s mistakes. I also knew that once I became an adult, I would be responsible for my own actions. In this case, the parents provided the vessel by which their child became addicted, and due to their inaction in their own child’s life, she became addicted. My kid didn’t, because I was a responsible parent who didn’t allow my child access to FB and IG. And if they did get access to those platforms, I would not have the audacity to blame FB and IG for something that I, as the parent, had control over. So Sean, it sounds like you either have no children or think that the woes of society always come from a third party.