Historic condemnation of Meta and YouTube for their impact on mental health

  • A Los Angeles jury finds Meta and Google (YouTube) liable for the addictive design of their platforms and orders them to compensate a young woman for damage to her mental health.
  • The rulings in the United States focus on the business model and the algorithms that generate dependency, not on the specific content that is published.
  • The European Union and Spain already have a legal framework (DSA, AI Regulation, protection of minors, data and defective products) that would allow claims for similar damages.
  • Experts warn of a systemic risk: increased depression, anxiety, eating disorders and political polarization among teenagers due to the intensive use of social media.

Social media and mental health

A verdict handed down in Los Angeles has set off alarm bells about the role of large digital platforms in youth mental health. A jury has ruled that Meta and Google (owner of YouTube) designed their services to create addiction, ordering them to pay multimillion-dollar damages to a young woman who developed compulsive social media behavior during her childhood.

The case, far from being an isolated anecdote, has become a turning point in the international debate. For the first time, it has been stated this directly that the architecture of these platforms, and not just the content they host, can cause serious psychological harm. And the question many in Spain and Europe are already asking is obvious: could our courts follow suit?

A jury in California focuses on addictive design

Judgment against digital platforms

In March 2026, a Los Angeles jury found Meta (owner of Instagram and Facebook) and Google (owner of YouTube) liable for contributing to a minor's addiction to social media. The young woman, now 20, began using these platforms in childhood and went on to develop depression, anxiety, low self-esteem, and suicidal thoughts. The compensation awarded amounts to several million dollars, with reports citing up to six million in certain counts and three million per company in others.

The relevant aspect of the verdict is not only the amount, but its logic: the jury concluded that the applications were designed as veritable “addiction machines.” What is being questioned is not a specific video, photo, or comment, but a set of design decisions that push the user to stay connected for as long as possible, often without being fully aware of it.

Among the techniques cited are infinite scrolling, autoplay videos, constant notification systems, and recommendation algorithms that bombard users with increasingly tailored content. This combination makes it difficult to set reasonable time limits and reinforces compulsive usage patterns, with a particularly strong impact on children.

Meanwhile, another jury in New Mexico found Meta liable for failing to prevent child sexual exploitation on its services and for concealing deficiencies in its protection mechanisms, imposing a penalty of $375 million. Both decisions point in the same direction: tech companies can no longer take refuge solely in the idea that they “only host content” generated by third parties.

A business model under suspicion

Legal experts and specialists in digital law agree that the problem is not isolated, but structural. Emilio Guichot, Professor of Administrative Law at the University of Seville, summarizes the issue: social networks rely on algorithms that maximize usage time, which creates a risk of widespread dependency, and not only among minors.

These algorithms work like “invisible architects” of behavior: they use intermittent reinforcement mechanisms, variable rewards (likes, comments, notifications of new followers), and extreme content personalization. All of this encourages users to return to the application again and again, even when they are aware that it is detrimental to their well-being.

In the case of adolescents, this design is especially delicate. Several studies highlight that young people are more sensitive to social reward and external validation. The "like" dynamic, beauty filters, and aspirational videos can therefore trigger constant comparisons, low self-esteem, and an obsession with fitting impossible models.

The scale of the phenomenon is reflected in the scientific data: meta-analyses with samples of more than one million teenagers have detected a consistent association between problematic use of social media and symptoms of depression and anxiety. The effect is not devastating in each individual case, but its consistency and scope make it a public health problem.

Furthermore, the impact is not uniform. Among girls, highly visual platforms like Instagram or TikTok reinforce exposure to content focused on the body and physical appearance, which is linked to an increase in eating disorders (ED) and body dysmorphia. In boys, the focus shifts to YouTube or Twitch, with a greater risk of video game addiction, muscle dysmorphia, and rigid or aggressive models of masculinity.

Consequences in Spain and Europe: mental health and radicalization

In Spain, the effects have been noticeable for years. According to various studies, around 40% of young people perceive that social media damages their self-esteem. Eating disorders now affect some 400,000 people, mostly teenagers, with a notable increase since the pandemic. Many specialists link this surge to increased screen time and the constant pressure to project a “perfect” image.

But the impact is not limited to the individual level. The same reinforcement dynamics that fuel addiction also foster information bubbles and political polarization. In Spain, nearly 80% of young people between 16 and 30 consume political information mainly through TikTok, Instagram, and YouTube, displacing traditional media.

In this environment, the most successful content is not necessarily the most rigorous, but the most eye-catching, emotional, or extreme. Communities such as the well-known “manosphere” thus exploit the popularity of videos about male success, gyms, or seduction to slip in misogynistic and anti-feminist messages and, little by little, openly authoritarian discourse.

The algorithms detect what hooks each user and provide them with more of the same, each time with greater intensity. The result is that many kids end up enclosed in ideological bubbles where hatred towards women, migrants, or minorities is normalized, and the far right is presented as the only way to regain “status” and “order.” This is not an accident, but the logical consequence of a system that rewards outrage and conflict.

This climate makes it difficult for formal education or traditional media to fulfill their counterbalancing role. When the algorithm replaces school and journalism as the primary source of information, the democratic capacity to form critical citizens suffers. Hence, many experts insist that the discussion about youth mental health is intimately linked to the quality of our public life.

The European framework: from the Digital Services Act to the AI Regulation

The good news for those demanding change is that the European Union has already begun to take action. The Digital Services Act (DSA), in force since 2022, requires large platforms to identify and mitigate so-called "systemic risks," which expressly include negative impacts on mental health and on minors.

This goes far beyond simply deleting isolated pieces of content. The DSA requires independent audits, regular reports, and corrective measures when service designs that cause structural harm are detected. An infinite scrolling or autoplay system could be considered part of that systemic risk if it is shown to encourage addictive behaviors.

The new EU Artificial Intelligence Regulation adds another layer of protection. Article 5 prohibits AI practices that, through subliminal or manipulative techniques, substantially distort people's behavior in a way likely to cause significant harm. Many legal experts point out that this description fits, at least in part, algorithms that exploit psychological vulnerabilities to keep users engaged.

In parallel, the European digital identity regulation envisions a digital wallet that would enhance traceability without completely eliminating anonymity. The idea is to hinder practices such as harassment, fraud, and identity theft, maintaining a certain level of privacy while preventing anyone from lying about key information such as age.

With this regulatory framework, the European Union now has the formal capacity to investigate, audit, and sanction big tech companies, with fines that can reach up to 6% of their global annual turnover. The Los Angeles ruling is not legally binding in Europe, but it serves as a symbolic precedent that may encourage authorities and victims to use the tools already available.

Spain's position: non-substance addictions and the protection of minors

Spain is not starting from scratch. For years, its legislation has expressly recognized "non-substance addictions" linked to digital technologies. Law 1/2016 on comprehensive addiction care lists the excessive use of social media and video games among problematic behaviors and urges administrations to deploy prevention policies.

In addition, Organic Law 8/2021 on the comprehensive protection of children and adolescents against violence requires the promotion of "safe digital environments," in collaboration with the private sector, to prevent harmful content and contact. Many experts maintain that a design that fosters dependency in minors can hardly be considered a safe environment.

Organic Law 3/2018 on the protection of personal data and the guarantee of digital rights is another key element. The algorithms that personalize content to the extreme rely on massive data collection to build very precise behavioral profiles. This raises an uncomfortable question: can we speak of "free and informed" consent when the interface is designed precisely to influence and condition the user's will?

Also on the table is the recent Directive (EU) 2024/2853 on liability for defective products, which explicitly includes software and AI systems. If a platform's intrinsic design is shown to cause proven harm, such as addiction or psychological disorders, the service could be classified as a defective product, giving rise to direct civil claims against its developers.

In the most serious cases, some criminal lawyers even point to the possible application of the Penal Code's offenses against moral integrity, where there is serious and deliberate harm to mental health, especially in vulnerable people such as minors. It would not be an easy path, but it illustrates how far the legal debate has moved beyond content alone to examine the heart of the business model.

What changes are planned for the design of the platforms?

Beyond the courts, both Europe and Spain are weighing the idea of intervening directly in the design features that foster addiction. Among the measures being discussed are limiting or prohibiting infinite scrolling, disabling automatic video playback by default, and introducing clear warnings about accumulated usage time.

There are also plans to strengthen effective age verification and to restrict minors' access to certain applications, following the lead of countries such as France, Denmark, Australia, and Spain itself, which have publicly advocated raising the minimum age for using some networks to 16.

The major platforms, for their part, have introduced parental control tools, timers, and options to limit notifications in recent years, but US courts have found these measures clearly insufficient against a core design aimed at maximizing time on the platform.

The fundamental question is whether companies are willing to give up part of a revenue model built on users' continuous attention, or whether they will try to preserve it with minimal adjustments. Business history is full of sectors that resisted changing harmful products until the very last moment, and only did so after a cascade of court rulings against them.

In this context, the conviction of Meta and YouTube for their impact on mental health is seen as a sign that the cycle of impunity may be reaching its limit. It remains to be seen whether Europe and Spain will leverage the existing legal framework to demand real accountability and force healthier product designs.

What is emerging clearly today is that social media has ceased to be a simple, neutral tool. Communication technologies are evolving into infrastructures that shape emotions, relationships, political identities, and psychological well-being. The experience of US courts, combined with the European regulatory framework, marks a new era in which the mental health of users, especially minors, is beginning to take center stage in public and legal discourse.
