
Building Resilience Against Election Influence Operations

 Preparing for the European Elections in 2024 and Beyond

Executive Summary 

In 2024, nearly half of the world’s population is heading to the polls. One of the biggest of these contests is the 2024 European Parliament elections from June 6–9, in which 366 million voters in the European Union’s 27 member states will elect 720 members of the European Parliament (EP). This year’s elections are taking place amid rapidly shifting geopolitical and technological landscapes. Now is the time for European countries and their partners to review and adapt the tools in their arsenals to combat malign election influence operations. 

Understanding that countries can mount a strong defense of their democratic processes by continually learning from each other, this report offers snapshots of best practices for building resilience against election influence operations adopted by five different European countries—France, Sweden, Estonia, Bosnia and Herzegovina, and Ukraine—since the last EP election in 2019. The best practices include: 

  • Bolstering the government’s ability to identify and counter foreign influence operations 
  • Adopting whole-of-society frameworks to build resilience across institutions and sectors of society most vulnerable to disinformation 
  • Investing in fact-based journalism and digital literacy programs for communities most at risk of being targeted, such as minority language populations 
  • Developing proactive crisis communication plans for election authorities to anticipate and respond to false narratives 
  • Leveraging robust, cross-sectoral coordination, particularly on emerging technology, to counter malign influence. 

As autocratic actors continue to refine their influence efforts, it will be essential for European countries and others to adopt a similarly holistic approach. This includes working collectively across all sectors of society and learning from the experiences of others facing similar challenges. The five best practices described in this report are just a few ways European countries, and other countries facing similar threats, can bolster their preparedness and ensure resilience in the face of information-related threats ahead of future elections. How well European countries and others are able to do so will be critical to the foundation of democracy and electoral processes across the region for years to come. 

Introduction

In 2024, more voters are heading to the polls than ever before. Nearly half of the world’s population lives in countries that are holding national elections this year. The outcomes will have global ramifications for years to come. One of the biggest of these contests is the 2024 European Parliament elections from June 6–9, in which 366 million voters in the European Union’s 27 member states will elect 720 members of the European Parliament (EP). 

Many countries remain vulnerable to election influence operations—covert or overt efforts by foreign and domestic actors to circulate false, misleading, or harmful information or narratives to impact an election. Those actions aim to sow discontent and erode trust in democratic systems. Countries have varying capacity and preparedness to face these challenges, and there is no unified framework for meeting shifting threats to electoral integrity. As malign actors are increasingly coordinating and learning from one another, it is more important than ever for countries to learn from one another’s best practices to bolster their resilience against election influence operations. 

Although trust in democracy and trust in elections often go hand in hand, trust in democratic elections is precarious in many parts of the world. Domestic political polarization, the prevalence of false information online, and autocratic actors taking aim at democratic systems all make it harder for many voters to believe in the integrity of their elections. Foreign adversaries are ramping up the scale and sophistication of their tactics, becoming more brazen in their efforts to influence elections. The European Union (EU) Special Committee on Foreign Interference warned that foreign interference and disinformation, particularly by the Russian Federation and the People’s Republic of China (PRC), are likely to “continue in ever-greater numbers and become more sophisticated in the run-up to the European Parliament elections.” Adding fuel to the fire, the rise of easily accessible artificial intelligence (AI) tools exacerbates vulnerabilities that malign actors could exploit to undermine future elections. The PRC is already reportedly experimenting with AI tools to conduct campaigns to amplify societal division in the United States before its 2024 presidential election. Other adversaries, such as Russia and Iran, could follow suit to influence other elections. 

These challenges are compounded by social media companies’ struggles to safeguard their platforms from information threats. Over the past year or more, a number of social media companies—notably X (formerly Twitter)—have rolled back some of their most constructive measures to uphold election integrity online. This includes disabling functions to report false election information, dissolving election integrity teams, and overhauling account verification services. Some companies have announced positive steps in response to widespread concern about election disinformation in 2024. These include Meta, which recently set up a team to tackle disinformation and AI abuse in the lead-up to the EP elections. However, others have not. These challenges, in addition to the introduction of newer, less tested platforms and tools such as Telegram, TikTok, Threads, and ChatGPT, increase concerns over whether information manipulation—obtaining and sharing information to disrupt democratic decision-making—could influence or lead to the disenfranchisement of voters ahead of the EP and other elections. 

On the positive side, many countries increasingly recognize the importance of bolstering their resilience to information manipulation and safeguarding electoral integrity. In August 2023, the EU’s Digital Services Act began to apply to the largest online platforms. This sweeping legislation aims to foster safer online environments by holding technology companies accountable for monitoring and removing harmful content from their platforms. In March 2024, the European Parliament approved the AI Act, the first comprehensive legal framework on artificial intelligence. Moreover, new EU rules on the transparency and targeting of political advertising are intended to make election and referendum campaigns more transparent and resistant to interference. In addition to measures focused on the information space, the EU has protections to ensure the successful administration of the 2024 EP elections. For example, most of the EU’s 27 national governments retain paper records of each vote, observe strong chain of custody procedures, and count votes manually. These critical safeguards enable verification of election results when skepticism or errors occur. 

With relatively little time before many of the largest remaining 2024 elections, including the EP elections, this is not the moment for drastic changes. However, countries with upcoming elections can still make incremental improvements up to election day. For the future, countries should review and, if necessary, bolster their strategies to prepare for and counter harmful election narratives. Democracies can and do learn from each other about how to respond to threats and build trust among their populations. For example, EU member states could further bolster their information ecosystems ahead of the 2024 EP elections by looking at how other European countries have confronted similar issues. 

This report offers snapshots of some best practices for continually building resilience against election influence operations adopted by five different European countries since the last EP election in 2019. The five case studies on the following pages—France, Sweden, Estonia, Bosnia and Herzegovina, and Ukraine—include some members of the EU and some non-members. For each of the five case studies, the paper will first showcase how the best practice bolstered the country’s resilience against information manipulation ahead of a recent election. Each case study will be followed by broad guidance to help other countries adopt some or all of these practices into their own contexts, with the understanding that no two countries are alike. While there is no “one size fits all” approach, and the best practices presented may not work for every country, those looking to strengthen the integrity of their information environments for future elections should consider the lessons discussed below and how to incorporate them into their own operations. 

This paper is not directed at any one country or election. Nor is it intended to cast doubt on the integrity of upcoming elections, particularly those, like the 2024 EP elections, that have a history of ensuring that the results reflect voters’ choices. Instead, it serves as a reminder that election information defenses must evolve continuously, and it provides ideas for how countries can do this in the short and long term. 
 

Table of Contents

  • France: Bolster the government’s ability to identify and counter foreign influence operations 
  • Sweden: Adopt a whole-of-society framework to bolster resilience to withstand information manipulation  
  • Estonia: Invest in fact-based journalism and other disinformation resilience programs for segments of society that are most at risk of being targeted by influence campaigns 
  • Bosnia and Herzegovina: Adopt a crisis communications strategy to preserve trust in the face of false and misleading information campaigns
  • Ukraine: Leverage robust, cross-sectoral coordination, particularly on emerging technology, to counter malign influence 
     

Recommendations

About IFES

About ASD