A set of internal documents published by the Republican members of the U.S. House Judiciary Committee (Judiciary GOP) has sent shockwaves through Brussels.
Released under the title The EU Censorship Files, Part II, the investigation presents documentary evidence of a sustained strategy by the European Commission to influence public debate on social media and digital platforms, pressuring major tech companies to censor lawful content, alter their internal rules, and restrict certain political viewpoints.
🚨The EU Censorship Files, Part II
For more than a year, the Committee has been warning that European censorship laws threaten U.S. free speech online.
Now, we have proof: Big Tech is censoring Americans’ speech in the U.S., including true information, to comply with Europe’s… pic.twitter.com/Fg0gxzoTxD
— House Judiciary GOP 🇺🇸🇺🇸🇺🇸 (@JudiciaryGOP) February 3, 2026
The revelations go far beyond an abstract debate over content moderation. According to the material made public, the Commission has directly or indirectly intervened in at least eight electoral processes across six European countries since 2023, using high-level meetings with digital platforms in the days and weeks leading up to elections to demand stricter censorship of political speech.
A decade-long censorship architecture
The leaked files reveal that this policy did not originate with the Digital Services Act (DSA), but dates back at least to 2015. As stated in the Committee’s official thread on X, “it launched as early as 2015, when the European Commission created ‘codes’ and ‘forums’ through which it could pressure platforms to censor speech more aggressively.” These mechanisms, publicly presented as voluntary and consensus-based, functioned in practice as tools of regulatory coercion.
Internal communications from the tech companies themselves, included in the dossier, are particularly revealing. They acknowledge that “the Commission sets the agenda, forces consensus, and platforms don’t really have a choice.” The primary goal was not the removal of individual pieces of content, but rather the reshaping of community guidelines—the global rules that define which ideas are allowed to circulate in the digital public square.
From the pandemic to electoral control
During the COVID-19 crisis, Commission President Ursula von der Leyen and then–Vice President Vera Jourová actively pressured platforms to remove content that challenged official narratives about the pandemic and vaccines. The report summarizes it bluntly: “They told platforms to change their rules and take down content questioning established narratives about the COVID-19 pandemic and vaccine.”
From 2022 onward, this strategy became institutionalized. Between 2022 and 2024, the Commission organized more than 90 meetings under the Disinformation Code, urging platforms to tighten their moderation rules on a global scale. With the entry into force of the Digital Services Act (DSA), the pressure intensified further, with companies warned that they would have to submit their rules to a “continuous review of community guidelines” in order to avoid sanctions.
The impact was immediate. In 2024, TikTok changed its global rules “to comply with the Digital Services Act,” a shift that, according to the documents, resulted in the censorship of truthful information and broad, vague categories of protected speech, even outside Europe. “That’s right: because of Europe’s censorship law, TikTok censors true information in the United States,” the Committee stresses.
Interference in European elections
The most sensitive aspect of the report is the direct link between these practices and specific electoral processes. Since 2023, the Commission has held meetings with digital platforms ahead of national elections in Ireland in both 2024 and 2025; in France in 2024; in the Netherlands in 2023 and again in 2025; in Slovakia in 2023; in Moldova in 2024; and in Romania that same year. In every case, the meetings took place at critical moments of the campaign and were aimed at intensifying the censorship of political content deemed problematic.
Internal platform documents reveal that following these meetings, TikTok censored widely used conservative political claims, such as the (true) assertion that there are only two sexes. The DSA Election Guidelines themselves state that platforms must “adapt their terms and conditions” ahead of elections to combat so-called disinformation. Although officially presented as voluntary recommendations, a senior DSA official admitted in private conversations that they were, in practice, mandatory.
The Romanian case is particularly controversial. A court annulled the 2024 presidential election, citing alleged Russian interference via TikTok. However, the platform itself informed the Commission that it had found “no evidence of a coordinated Russian campaign” to support the winning candidate, Calin Georgescu. Subsequent investigations revealed that another Romanian political party had in fact funded the campaign attributed to Russia.
The Judiciary GOP revelations force a reassessment of the balance between regulation and freedom within the European Union. Beyond the official rhetoric about combating disinformation, the documents describe an architecture of narrative control that strikes at the heart of democratic pluralism and citizens’ right to debate freely without ideological oversight.