Brussels points to TikTok's addictive design and puts the EU on guard

  • The European Commission accuses TikTok of violating the Digital Services Act due to an addictive design based on infinite scrolling and hyper-personalized recommendations.
  • Brussels sees serious risks to the mental health and well-being of children and vulnerable adults, based on studies from several European countries and the WHO.
  • The EU demands that TikTok redesign key features (infinite scrolling, auto-play, notifications, algorithm) or risk a fine of up to 6% of its global revenue.
  • The company argues that the Commission's description is "false and unfounded" and announces that it will appeal as the debate on limiting minors' access to social media grows in Europe.


The tension between European institutions and major digital platforms has taken a significant leap forward, with the focus now on TikTok's addictive design. After almost two years of analysis, the European Commission preliminarily considers that the social network violates the Digital Services Act (DSA) because of the way it structures its use and hooks users, especially younger ones.

Brussels maintains that certain design features, such as infinite scrolling, autoplay videos, push notifications, and a "highly personalized" recommendation system, create an almost compulsive consumption pattern. It also contrasts this with solutions that let users adjust the Reels algorithm to limit automatic personalization.

What does the EU mean by “addictive design” in TikTok?

The focus is not on a specific video or an isolated type of content, but on the overall architecture of the service. The Commission describes a system in which every element, from how you navigate the screen to how the app chooses which clips appear, is optimized to keep the user inside the application for as long as possible.

According to the European Commission's statement, infinite scrolling is one of the central pieces of that mechanism: the video stream never runs out, and with the swipe of a finger a new stimulus always appears, with no need to search or make conscious decisions. This is combined with autoplay, which launches each video without the user having to click, further reducing the "natural pauses" that would allow users to disconnect.

Another key component is push notifications, which arrive regularly to remind users of pending content, new followers, or trends to watch. According to Brussels, this steady stream of alerts encourages users to reopen the app repeatedly throughout the day, even during rest or study time.

Finally, the personalized recommendation system functions as the glue that binds all those pieces together. Based on viewing history and interactions, the algorithm prioritizes the videos most likely to capture each person's attention, fueling the so-called "rabbit hole effect": the more you consume, the more the content is tailored to your tastes, and the harder it is to break out of that pattern.
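The narrowing dynamic described above can be sketched as a toy "rich get richer" simulation. To be clear, this is an illustrative model only: the topic names, the equal starting weights, and the +0.5 reinforcement per view are all invented for the example, and nothing here reflects TikTok's actual algorithm.

```python
import random

def simulate_feed(steps: int, seed: int = 42) -> dict[str, float]:
    """Toy engagement loop: each watched video boosts its topic's weight,
    so the next recommendation is more likely to come from that topic."""
    random.seed(seed)
    weights = {"sports": 1.0, "music": 1.0, "comedy": 1.0, "news": 1.0}
    for _ in range(steps):
        topics = list(weights)
        # Pick the next video in proportion to current interest weights.
        topic = random.choices(topics, weights=[weights[t] for t in topics])[0]
        # Treat the view itself as positive feedback.
        weights[topic] += 0.5
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}

shares = simulate_feed(200)
print(f"dominant topic share after 200 videos: {max(shares.values()):.0%}")
```

Each topic starts with a 25% share; because every view reinforces the topic just watched, the distribution drifts toward whichever topic happens to get early attention, which is the feedback loop regulators describe.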


Impact on children and vulnerable adults

The Commission's central concern is how this addictive design affects users' physical well-being and mental health. The report focuses particularly on two groups: minors and vulnerable adults, such as people with pre-existing mental health problems or with fewer resources to manage their screen time.

According to the investigation, by constantly "rewarding" users with new videos, the platform reinforces the need to keep swiping. This feeling of instant gratification can translate, according to the scientific literature cited by Brussels, into compulsive behavior that reduces self-control, increases viewing time, and displaces other activities, from sleep to studying.

EU sources point out that these are not isolated hunches. The Commission has reviewed studies from several member states and the World Health Organization. A French parliamentary report, for example, indicates that around 8% of children aged 12 to 15 spend more than five hours a day on such platforms. Another Danish study detects that eight-year-old children use short video networks for an average of about 130 minutes a day.

A third study, from Poland, finds that TikTok is the platform teenagers aged 13 to 18 use most after midnight. Brussels also highlights that the social network is very popular among children aged 7 to 12, with nearly one in three opening the app more than 20 times a day on average, including during school hours.

Based on that empirical evidence, Henna Virkkunen, the Executive Vice President responsible for Technology Sovereignty, Security and Democracy, warns that social media addiction can be especially harmful to "the developing minds of children and adolescents." She also emphasizes that the DSA "holds platforms responsible for the effects they may have on their users," with particular emphasis on protecting children in the European digital environment.


A thorough, scientifically based European investigation

The preliminary findings did not emerge from nowhere. The Commission opened formal proceedings against TikTok in February 2024, suspecting that the social network was violating several obligations under the Digital Services Act, including the management of systemic risks related to mental health and the protection of minors.

During this time, the European Commission has analyzed the internal risk assessment reports submitted by the company, as well as corporate data and documents. It has also sent the platform multiple requests for information to clarify how its recommendation system works, what indicators it monitors, and what mitigation measures it currently applies.

In parallel, the Commission's services have carried out an extensive review of the scientific literature on behavioral addiction, compulsive social media use, and the effects of screen time on mental health, especially in children. To complete the picture, they interviewed experts from fields ranging from psychology and psychiatry to data protection and digital security.

From this cross-referencing of internal documentation, data provided by TikTok, and independent studies, the Commission concludes that the platform did not adequately assess the impact of its addictive features on users' physical and mental well-being. Among other things, it allegedly ignored indicators such as how much time minors spend in the app at night, how often the application is opened, or access patterns that could signal problematic behavior.

Brussels emphasizes that these conclusions are, for now, preliminary findings that do not prejudge the final outcome of the proceedings. However, they clearly define the playing field: if the company fails to convince the regulator that it can correct the identified risks, the case could end in a formal finding of non-compliance with significant penalties.

TikTok's measures under scrutiny: controls and time limits

One of the most delicate points of the European analysis is its assessment of the safety tools TikTok claims to have in place. The platform often highlights its parental control options, rest reminders, and screen time limits, especially for accounts associated with minors.

However, the Commission believes these measures are neither reasonable nor sufficiently effective to offset the potentially addictive nature of the service's overall design. On screen time management, Commission sources indicate that the reminders are easy to dismiss and introduce little real friction into the user experience, so many users continue to spend long periods online despite the warnings. As an example of regulatory pressure on practices targeting minors, there are precedents such as the Netherlands' sanctions against Fortnite.

With respect to parental controls, Brussels believes they may not function as adequate safeguards because they require a level of involvement, time, and technical skill that not all parents can provide. From this perspective, the burden of protection cannot fall solely on families if, at the same time, the app's basic design continues to incentivize hyperconnectivity.

The European Commission insists that TikTok "appears not to apply reasonable, proportionate and effective measures" to reduce the risks the company itself has identified as stemming from its addictive design. Hence, the current safety layers are seen more as a complement than as a fundamental correction of the platform's architecture.

In this context, the Commission notes that the DSA does not simply require transparency or visible buttons; it obliges large platforms to actively mitigate the systemic risks generated by their own operating models, from misinformation to the mental health problems associated with intensive use.


TikTok's response and the role of the right to defense

TikTok's reaction to the accusations was swift. A company spokesperson described the Commission's findings as "a categorically false and totally unfounded description" of the platform and stated that the company will take every measure in its power to challenge them.

In practice, this means that TikTok can now exercise its right of defense within the framework of the DSA procedure. Among other things, the social network can examine the documents that make up the file, access the evidence on which Brussels bases its case, and respond in writing to the Commission's observations.

Following this phase, the case will also be discussed in the European Board for Digital Services, the body that brings together the national authorities responsible for enforcing the DSA. This forum will allow the European regulator's position to be weighed against the experience and priorities of the Member States, especially regarding the protection of minors and the harmonization of safety standards.

Despite the firm tone of the EU authorities, the proceedings are not closed. Sources close to the investigation admit that it is “too soon” to determine exactly what design changes would satisfy the Commission. The company is open to proposing technical alternatives that mitigate the identified risks without completely altering its business model.

In any case, Brussels makes it clear that cosmetic tweaks will not suffice. The platform must demonstrate, with data and impact assessments, that any adjustments it introduces produce a real reduction in compulsive use and in the potential harm to mental health, particularly among younger users.

Regulatory pressure and political debate in Europe

The case opened against TikTok is part of a European context of increasing pressure on the big platforms. The DSA, along with the Digital Markets Act (DMA), was designed precisely to speed up regulatory responses to potential breaches, but experience shows that investigations remain complex and lengthy, especially when rapidly changing algorithms and the progressive integration of artificial intelligence come into play. Within this context, legal cases over addictive practices have also emerged in other sectors, such as the lawsuit against Fortnite for addiction.

Meanwhile, several member states have begun to make their own moves. In Spain, the Prime Minister, Pedro Sánchez, has announced his intention to prohibit access to social networks for children under 16 years old, a measure that is complicated to implement but reflects social concern about the impact of these platforms on children and adolescents.

Other European countries are exploring similar solutions. The United Kingdom is also studying whether to bar minors under 16 from social networks, while France is considering setting the bar at 15. Outside Europe, Australia has already moved in that direction, legally restricting access to these types of services for minors under 16.

From Brussels, these national initiatives are viewed with some support, but also with caution. The Commission points out that the DSA aims to harmonize protection of European citizens and warns that states should not create additional obligations for platforms that clash with the common framework. The focus, it insists, should be on the actions of companies, not on making executives directly responsible for user-generated content.

Vice President Henna Virkkunen has gone even further, stating that there is a widely held view among EU governments, and outside Europe as well, that major platforms "are not doing enough to protect minors." The TikTok case, in this sense, would be just the first in a series of investigations, as the Commission is also working on inquiries into other companies such as Meta, the owner of Facebook and Instagram.


What changes does Brussels demand to TikTok's design?

Beyond the diagnosis and the threat of sanctions, the Commission outlines some of the changes it expects to see reflected in the service. First, it asks TikTok to modify what it calls the "basic design" of its platform, paying particular attention to the features considered most addictive.

Among the options on the table is gradually disabling infinite scrolling or, at least, introducing clear limits that force users to make a conscious decision to keep watching videos. Effective "screen breaks" are also mentioned, with real interruptions to the flow of content, particularly at night, when the impact on sleep and school performance can be greatest.

Another line of action is adapting the recommendation system to keep the rabbit hole effect from becoming too intense. This could translate into algorithms less aggressively optimized to maximize viewing time, changes in how content is presented, or clearer options for users to regulate what type of videos they want to receive and at what intensity.

The Commission is not going into the specifics of concrete solutions at this stage, but it does make clear where responsibility lies: it is TikTok that must propose mitigation measures that allow regulators to verify that the risk of addiction is being reduced. To that end, the European regulator expects to see regular assessments, data on nighttime use, screen time statistics, and analyses of how minors respond to design changes.

Other elements on the radar are push notifications and the frequency with which the app "pushes" the user to return. Limiting or reconfiguring these notifications, especially for teenage profiles, could be one way to reduce the constant pressure to open the app at all hours.

If the Commission ultimately considers that TikTok has not taken sufficient steps, or that the proposed changes are merely cosmetic, it may issue a formal decision of non-compliance. In that scenario, the DSA allows a fine of up to 6% of the company's annual worldwide turnover, an amount that, given the group's global revenue, would run to several billion euros if the maximum were applied.


All this movement places TikTok's addictive design at the heart of a debate that extends far beyond a single application: in Europe, the idea is gaining ground that platforms not only distribute content but also, through their architecture, shape how users behave, how much time they spend online, and what impact this has on their health. The outcome of this case will largely determine the regulatory model the EU applies from now on to any digital service that bases its success on keeping users glued to the screen.
