"Addicted": Young woman sues Instagram and YouTube

Published on: February 24, 2026 · Categories: Legal, Tech & E-Commerce · Reading time: 3 min.
Hakan Tok writes articles on technical topics in the blog Recht 24/7 Love & Law.

Image: FotoField / Shutterstock.com

A 20-year-old woman is taking Instagram and YouTube to court in Los Angeles. Her accusation is explosive: she claims the platforms are deliberately designed to make them almost impossible to leave. Endless feeds, autoplay, filters, recommendations: it all works like a digital slot machine. She says this has triggered depression, anxiety, and massive pressure around her own body image. Deutschlandfunk reports on the case in detail. This is about more than one individual's fate: the lawsuit could become a litmus test for how far tech companies can be held responsible for their product design.

What the trial is really about

The plaintiff appears under the initials KGM. She was active on social media from a very early age, first on YouTube, then on Instagram, and later on Snapchat and TikTok. According to her account, the constant scrolling and comparing drew her into a spiral of consumption, self-doubt, and dissatisfaction. Algorithmic recommendations and beauty filters in particular distorted her self-image.

Snapchat and TikTok are already out of the picture following a settlement. That leaves Meta (Instagram) and Google (YouTube)—two of the world's biggest advertising machines.

The sticking point: content or design?

In the US, platforms have a powerful shield: Section 230. Put simply, companies are usually not liable for what users post. This is precisely what corporations rely on—they say that if damage has been caused, it is because of third-party content, not because of their apps.

However, Judge Carolyn Kuhl did not simply let this pass. She made it clear that if the problem is not the individual post but the design, liability could be possible. In other words, it is not "what" is shown, but "how" it captures your attention. That is the door that is being opened here.

What Meta and Co. say in response

Meta argues that the young woman's difficulties had already begun before she used the platforms. Furthermore, there is no diagnosis of clinical addiction. Mark Zuckerberg said in court that Meta no longer pursues goals of maximizing usage time. The companies generally deny that social media is addictive. They also reject the accusation of specifically targeting children, citing the minimum age of 13, although Zuckerberg admitted that age verification occasionally had weaknesses.

Why this ruling could shake up the industry

Thousands of similar lawsuits are now pending in the US—from families, school districts, and states. If juries rule that design can cause harm and that companies can be held liable for this, the defense strategy employed by corporations to date will become untenable. Observers compare this to the long road taken by tobacco lawsuits: initially dismissed, later costly.

Critical commentary

The debate too often hinges on the question, "Should we ban this for young people?"—as if adults were immune. This seems convenient, but it is short-sighted. If a product only works because it systematically keeps people in a continuous loop, then it is not the user who is the problem, but the product. And then pointing fingers won't help—only design rules that bring about noticeable change will.


Sources: deutschlandfunk.de

Do you have similar problems with social media? Contact us now and get legal support!

At a fixed price of EUR 169 (gross)