The European Commission is taking action against addictive mechanisms in social media for the first time, demanding far-reaching changes to TikTok’s platform design. Brussels has called on the video platform to disable or modify key features, including endless scrolling, and to introduce strict screen time breaks. Additionally, recommendation systems must be revised. The demands follow the Commission’s official finding that TikTok’s design is addictive—particularly for children and adolescents.
The results published in early February 2026 mark the first time the Commission has stated its position on the design of a social media platform under the Digital Services Act. This flagship piece of EU legislation for online content is intended to protect users and requires platforms like TikTok to assess and mitigate risks to their users. The investigation into TikTok has been ongoing for approximately two years, with the Commission now deciding to target the core of the platform’s design and classify it as a risk to users’ mental health.
Landmark Decision for Social Media Regulation
Civil society observers view the decision as groundbreaking for the surveillance and advertising-based business model of platforms. The president of the Polish civil rights organization Panoptykon Foundation emphasizes the fundamental importance of this measure for the entire industry. Research institutions see it as a turning point, as the Commission now treats addictive design in social media as an enforceable risk. For the first time, a regulatory authority anywhere in the world has attempted to establish a legal standard for the addictive potential of platform design, as a senior Commission official explained.
The impact is unlikely to be limited to TikTok. Meta’s Facebook and Instagram have been under investigation since May 2024 for the addictive potential of their platforms, with their design and algorithms also under scrutiny, including the question of whether they endanger children. Digital rights experts consider it highly likely that the Commission will use these findings as a template and take action against other companies. EU technology chief Henna Virkkunen signaled that the Commission’s work on systemic risks is entering a new phase of maturity.
Lengthy Enforcement Process Expected
TikTok can now defend its practices and review all evidence the Commission has considered. The company has announced it will challenge the findings, which it describes as categorically false and entirely unfounded. If TikTok cannot satisfy the Commission, it faces fines of up to six percent of global annual revenue. In another, simpler enforcement case under the Digital Services Act, the Commission needed more than a year after submitting preliminary findings to establish non-compliance—suggesting a lengthy process in the TikTok case as well.
The Commission could ultimately agree with the platforms on a broad range of changes addressing addictive design. Which specific measures are taken will depend on the different risk profiles and usage patterns of individual platforms and on how each company defends itself. Governance researchers expect different solutions for different platforms, as similar design elements serve different purposes and carry different risks. The range of possible interventions extends from altering default settings, to completely banning certain design features, to expanded user control options. TikTok will likely not find the right solution on the first attempt and will need multiple tries to satisfy Brussels.