Russia’s War in Ukraine Puts EU Disinformation Rules to the Test



Natalie Jenkins

As Russia’s war in Ukraine enters its fourth year, disinformation surrounding the conflict continues to proliferate online. This March, signatories to the Code of Practice on Disinformation published their first transparency reports since the Code was formally integrated into the European Union’s Digital Services Act (DSA), offering an early look at how the EU’s largest platforms are attempting to meet their new obligations. Whether those efforts are sufficient is another question.

What the DSA requires 

Under the DSA, VLOPs (very large online platforms) and VLOSEs (very large online search engines) are required to assess and mitigate systemic risks linked to their services. These include the spread of illegal content, threats to fundamental rights, civic discourse and elections, and negative effects in relation to gender-based violence, public health, and the protection of minors. Disinformation is considered a threat to civic discourse and elections, which places it within the DSA’s scope.

However, the regulation stops short of defining specific benchmarks. Platforms retain significant discretion in identifying and addressing these risks, which can produce uneven results and limit the regulation’s real-world impact.

Filling the gaps 

This is where the Code comes in. Updated in 2022 and endorsed as an official DSA Code of Conduct by the European Commission and the European Board for Digital Services in February 2025, the Code now fills a critical gap. It sets out a definition of disinformation and a range of measures, including demonetizing disinformation, increasing transparency in political advertising, protecting service integrity, and empowering fact-checkers, users, and researchers. These structured benchmarks give regulators a concrete basis for evaluating platform compliance.

Signatories commit to preparing annual self-assessment reports for review by the European Commission. Platforms that choose not to sign the Code must still demonstrate they are taking equivalent measures to prevent and mitigate disinformation.

Recent reports 

The March reports outline how VLOPs and VLOSEs are tackling disinformation, covering policy updates, platform tools, and external partnerships. Dedicated sections address ongoing crises, including Russia’s war in Ukraine. The government of Canada has confirmed that Russia deliberately propagates disinformation to justify its invasion. For audiences in Canada, major platforms play a central role in distributing it.

Google’s report offers insight into how one platform is responding. Through its Threat Intelligence Group, Google monitors and disrupts coordinated influence operations targeting Ukraine and Eastern Europe. The company has also restricted monetization and advertising linked to Russian actors and war-related disinformation, bolstered Ukrainian digital infrastructure through Project Shield, and published findings on state-backed cyber threats to inform both the public and law enforcement.

Real accountability?  

Despite its ambitions, the Code has a fundamental limitation. As a voluntary form of self-regulation, it lets platforms determine which measures they sign on to and leaves them responsible for implementing those measures. This structure depends heavily on corporate goodwill and offers limited external pressure when that goodwill erodes.

Reporting from Tech Policy Press suggests that goodwill has already eroded. Between 2022 and 2025, platforms collectively reduced the number of measures they committed to under the Code by 31%, according to a report by Democracy Reporting International. X abandoned the Code entirely following Elon Musk’s acquisition of the company, while Google and Microsoft also scaled back their commitments.

The Code’s integration into the DSA was meant to address this. But this alone does not amount to enforcement. 

Claes de Vreese, University Professor of Artificial Intelligence and Society at the University of Amsterdam, told Tech Policy Press that the voluntary framework was not doing its intended work and that companies walking away exposed a deeper structural problem. Proper implementation, monitoring, and compliance assessment are needed. Responsibility for this, he stated, falls squarely on the European Commission.

Whether the Commission will rise to that challenge remains an open question. But in the context of an ongoing war where countless lives have been lost, this ambiguity is unacceptable.

This article has been written by Natalie Jenkins as part of the Local Journalism Initiative.


