Europe Must Launch Permanent Defenses Against Russia’s Disinformation War

Russia’s full-scale invasion of Ukraine in February 2022 was preceded by something less visible but equally deliberate: a sustained information offensive designed to paralyze Western decision-making, fracture allied cohesion, and make the invasion seem, if not justified, then at least comprehensible to confused audiences. By the time Russian tanks crossed the border, the information war had already been running for years.

As my coauthors and I discuss in Economist for Ukraine’s new policy brief, Russian state-sponsored disinformation has deep roots in Soviet-era “active measures” — the KGB’s doctrine of covertly shaping perceptions and steering opponents’ choices without direct confrontation. A well-known example is Operation Denver, a 1980s KGB campaign that successfully spread the false claim that HIV/AIDS originated from U.S. bioweapons research. Since the mid-2000s, and with accelerating intensity after 2014, these practices have been modernized through digital platforms, data-driven targeting, and a commercially available infrastructure for mass account creation and inauthentic amplification.

The aim of Russian disinformation campaigns is not to convince audiences of a single false narrative. It is to generate doubt, confusion, and informational exhaustion. After Russia shot down Malaysia Airlines Flight MH17 in July 2014, killing 298 people, Russian-linked outlets promoted multiple mutually contradictory explanations — including Ukrainian responsibility and Western fabrication — within the same news cycle.

This “firehose of falsehood” model relies on volume, repetition, and strategic indifference to internal consistency. The goal is not to establish a competing version of events, but to ensure that no version — including the truth — feels certain. Another tactic, “information laundering,” introduces claims through anonymous blogs or cloned websites, then amplifies them through coordinated social media accounts until the original state-linked source becomes obscured and the narrative appears independently corroborated.

Moscow’s foreign information manipulation is an integral component of its military doctrine, what Western analysts call the “Gerasimov doctrine” after Valery Gerasimov, Chief of the General Staff of the Armed Forces of the Russian Federation. It views information warfare as a critical element of kinetic operations, deployed alongside cyberattacks, attacks on infrastructure, and covert action.

This doctrine guided both NotPetya — the 2017 cyberattack on Ukrainian energy and financial infrastructure that caused over $10 billion in global damage — and the deployment of Doppelgänger and False Facade, coordinated networks that cloned trusted media domains, synchronized posting schedules, and used paid advertising to seed narratives into targeted European audiences.

The Kremlin’s disinformation system is sustained by serious institutional investment. State-controlled television network RT alone receives over €350 million per year in state subsidies. Russia’s 2025 and 2026 federal budgets each allocated approximately €1.5 billion to state-controlled media, a nearly 30 percent increase over 2021 levels. The Internet Research Agency’s Project Lakhta, the operation behind the 2016 U.S. election interference campaign, ran on a monthly budget of $1.25 million.

European responses to Russian disinformation address important components of the problem, but they do not yet add up to a system. France’s VIGINUM, established in 2021, provides rule-based detection of foreign digital interference. EUvsDisinfo, run by the East StratCom Task Force, has documented and attributed pro-Kremlin narratives since 2015. During COVID-19, the Re-open EU platform demonstrated that a single authoritative information reference point can function effectively under pressure. Before Russia’s 2022 invasion, the United States and United Kingdom publicly released intelligence to pre-empt Russia’s planned false-flag narratives, a prebunking operation that complicated Moscow’s information strategy and helped sustain allied cohesion.

Yet none of these efforts has been institutionalized as a permanent, standing capability. No equivalent EU-level mechanism currently exists to provide consolidated, authoritative communication during periods of heightened foreign information manipulation — elections, sanctions escalations, energy crises. 

Additionally, the EU’s most prominent counter-disinformation initiatives — including EUvsDisinfo and the European Digital Media Observatory — operate on budgets in the €2–11 million range per initiative. Even in aggregate, identifiable EU-level spending amounts to tens of millions to low hundreds of millions of euros.

The EU’s defenses, designed mostly for peacetime communications challenges, are now being tested by a wartime adversary, and they are not up to the challenge. In the brief, we describe three structural changes that could substantially improve Europe’s position.

First, the EU must treat Russian disinformation as a standing security challenge and respond accordingly. This means embedding monitoring, cross-platform analysis, and rapid response into routine governance structures, funded across electoral cycles, not crisis by crisis. EU institutions and Member States need more permanent analytical and coordination capabilities.

Second, the EU should build genuine coordination across Member States. Russia’s campaigns operate across borders and platforms simultaneously; European responses remain largely nationally bounded and unevenly distributed. Shared standards for identifying state-linked manipulation, joint evaluation of interventions, and systematic exchange of lessons learned are not aspirational goals; they are prerequisites for operating at the adversary’s scale. Existing mechanisms — including the EU’s Rapid Alert System and the European Centre of Excellence for Countering Hybrid Threats — should serve as the backbone of this coordination, with Member States committed to systematic data-sharing rather than ad hoc cooperation.

Third, the EU should invest in durable societal resilience through education. Sweden’s Psychological Defence Agency and Finland’s long-standing integration of media literacy into national education provide proven models. Prebunking — inoculating people against manipulation techniques before exposure — has shown particularly promising results in peer-reviewed research and real-world deployments ahead of elections. The EU should assess existing programs rigorously, scale those that demonstrably improve resistance to manipulation, and discontinue those that do not.

The question for European policymakers is therefore not whether the tools to counter Russian disinformation exist. They do. The question is whether those tools are integrated, permanent, and proportionate to the scale of the challenge. Moscow devotes billions of euros annually to an information infrastructure designed to corrode European democracies from within. Europe currently responds with tens of millions, distributed across fragmented initiatives, activated episodically.

Some might argue that Moscow will simply adapt — shifting narratives, platforms, and tactics — forcing democracies into a perpetual game of catch-up. In our view, this overlooks three important points. First, adaptation is costly: rebuilding networks after takedowns, responding to educational inoculation programs, and evading regulatory scrutiny requires time, resources, and operational risk. Countermeasures can meaningfully raise those costs and reduce reach, even without eliminating the threat entirely. Second, this logic is not applied consistently: policymakers already accept the need for continuous adaptation in cybersecurity, counterterrorism, and financial crime, with no expectation of final victory. Third, there is little evidence that disinformation campaigns dissipate on their own. Absent persistent countermeasures, they adapt, entrench, and expand.
