Sunday, January 25, 2026

Hybrid Warfare during Elections: Is it possible to remain independent?


Elections. Photo: Element5 Digital on Unsplash

Introduction

The issue of hybrid warfare has emerged as a critical topic in contemporary international relations. In recent years, there have been numerous instances where foreign powers, typically large and influential ones such as Russia or China, have intervened in other states’ elections to expand their geopolitical influence there.

From an electoral perspective, 2024 was historic, often described as a “super year”. More than 50 countries worldwide held elections in some form, with a combined population of approximately 4.2 billion people. In the USA, there was a race for the White House between Donald Trump and Kamala Harris (after Joe Biden withdrew). The world’s largest democracy, India, held general elections, as did the European Union, Mexico, Bangladesh, Taiwan, and others. Such a large number of people going to the polls poses a threat to democracies and an opportunity for the major geopolitical actors (Russia, China…).

Like everything else in this world, methods of election interference evolve. Historically, election interference involved physical activities such as intimidation, burning ballot papers, and tampering with ballot boxes. Today, the methods are fundamentally different. Attackers target social networks like TikTok, Facebook and YouTube, and they establish websites specifically designed to spread fake news and propaganda. This shift is alarming: we should be concerned not only about whether votes are counted correctly and vote shares reported accurately, but even more about the damage to, or outright destruction of, trust in the fairness of elections and, thus, in democracy itself.

This paper is structured into four main sections. First, we will look at and analyse the history, evolution, and goals of the attackers. The second part will focus on the methods and tools used in hybrid warfare and hybrid attacks. The third section will provide a comparative analysis of case studies, mentioning some cases where hybrid warfare played a significant role, whether successfully executed or, fortunately, detected in time. And finally, the paper will focus on and discuss current threats and dangers, as well as potential solutions for protecting democracy, and, if possible, how we can ensure election independence.

This leads to the paper’s main question: in this age of digital connectivity, Artificial Intelligence and an immeasurable amount of information to process, is it possible to ensure truly independent elections, or must we resign ourselves to the fact that every election is a battlefield?

1.    Theoretical background and historical evolution of hybrid actions

1.1   What is hybrid warfare?

To understand the current threats to electoral processes, it is essential to define the term “hybrid warfare.” For this purpose, I will cite one of the leading theorists in this field, Frank G. Hoffman. In his work Conflict in the 21st Century: The Rise of Hybrid Wars, he defines hybrid wars as follows:

“Hybrid Wars can be conducted by both states and a variety of nonstate actors. Hybrid Wars incorporate a range of different modes of warfare, including conventional capabilities, irregular tactics and formations, terrorist acts including indiscriminate violence and coercion, and criminal disorder.” (Hoffman, 2007, p. 14)

Hybrid warfare is therefore not just one type of attack, but a combination of many modes used together. It is not confined to the physical battlefield; as mentioned previously, it also plays out on the internet, in the media, and in other domains. Furthermore, these attacks do not originate solely from states; numerous non-state actors also seek to maximise the benefits of such attacks.

Frank G. Hoffman also adds:

“These multi-modal activities can be conducted by separate units, or even by the same unit but are generally operationally and tactically directed and coordinated within the main battlespace to achieve synergistic effects. The effects can be gained at all levels of war.” (Hoffman, 2007, p. 29)

That means attackers do not act randomly. Disinformation and attacks on digital infrastructure are often coordinated by a command centre, allowing their effects to be multiplied (synergy). In elections, we can see this synergy, for instance, in the simultaneous dissemination of false information and fake news and the stoking of hatred among people (among many other examples), all aimed at undermining trust in democracy. Although this work dates back to 2007, it remains an accurate description of hybrid warfare.

It is important to note that hybrid attacks very often lie in a contested area somewhere between routine state policy and open warfare, known as “the grey zone”. Attacks are intentionally kept within the borders of the law, or at least below the threshold of a military reaction. While these attacks often damage the targeted state, they do not, for instance, justify activating Article 5 of the NATO Treaty. Attackers thus achieve strategic successes with low effort and minimal risk.

However, discussing elections in the context of warfare can be somewhat misleading: there are usually no tanks, soldiers or aircraft involved. François du Cluzel, head of innovative projects at the NATO Allied Command Transformation Innovation Hub, describes a new form of warfare known as Cognitive Warfare. In the NATO Innovation Hub article Cognitive Warfare, he states:

“Cognitive Warfare is therefore the way of using knowledge for a conflicting purpose. In its broadest sense, cognitive warfare is not limited to the military or institutional world. Since the early 1990s, this capability has tended to be applied to the political, economic, cultural and societal fields. Any user of modern information technologies is a potential target. It targets the whole of a nation’s human capital.” (du Cluzel, 2021, p. 6)

Attackers, therefore, are not aiming at the state’s physical infrastructure, like roads or factories, but at the infrastructure of the mind. Their goal is to change people’s thinking or lead them toward a specific idea. This is especially crucial during elections: even though a single voter’s opinion may seem unimportant, the collective decision at the polls can change the geopolitical orientation of an entire country, and the impact on state sovereignty can be devastating.

And what does cognitive warfare look like in practice? For this, I will refer again to some lines from du Cluzel:

“Cognitive warfare pursues the objective of undermining trust (public trust in electoral processes, trust in institutions, allies, politicians…), therefore the individual becomes the weapon, while the goal is not to attack what individuals think but rather the way they think.” (du Cluzel, 2021, p. 8)

This means the goal is to change people’s perception of reality before they go to the ballot boxes. Attackers often exploit emotions, such as anger and fear, combined with an overwhelming amount of information, leaving the target confused and unsure of what to believe. In contrast to the propaganda commonly used in the 20th century (and still used in some countries), cognitive warfare does not necessarily claim “we are the best”; instead, it spreads messages like “they are all bad, they don’t like you, corruption is everywhere, do not vote”. The goal is thus to spread social passivity and hatred among the population.

1.2   History of hybrid warfare

The effort to influence the enemy is as old as war itself; where spies once targeted the highest echelons of government, social networks and the internet now reach ordinary citizens. Even though technology changes, the essence remains the same. Current conflicts are not state-versus-state contests of military strength, but something more complicated, targeting the human mind. Let us now look at the history and evolution of the methods used in hybrid warfare.

J. E. Leonardson, in his review of the books Active Measures by Thomas Rid and Information Wars by Richard Stengel, states:

“State-sponsored covert influence operations (…) began a century ago, pioneered by the nascent Bolshevik regime.” (Leonardson, 2020, p. 1)

State-funded operations are therefore not new; we can trace their roots back a hundred years, to the time of the Bolshevik regime. The book Active Measures introduces the concept of a “modern era of disinformation”, divided into four main waves.

The first wave took shape in the early 1920s, in the interwar years. This was the era of “The Trust”, a Soviet front organisation that operated, or was believed to operate, for five years.

“The story involves revolutionary Communist spies, exiled royal insurgents, love, extortion, kidnappings, mock and real executions, a fake book (…)” (Rid, 2020, p. 14)

The Soviet Cheka, predecessor of the KGB, used these “weapons” to destroy the opposition in exile. As mentioned earlier, this marked the beginning of the era of state-sponsored covert influence operations.

The second wave came after World War II. Disinformation became more sophisticated and professionalised, and the two blocs named it differently. In the USA, the CIA called it “political warfare”, a very broad term. The Eastern Bloc, in contrast, introduced the more precise term “disinformation”, which is commonly used today. Whatever the name, the goals of both sides were the same: to increase tension in the adversary’s nation by leveraging facts, misinformation, or a combination of both.

The third wave started in the late 1970s. Disinformation became highly sophisticated and well-resourced, administered by a bureaucratic apparatus. The Soviet Union gained the upper hand and became increasingly influential. However, everything in excess is harmful, and the Soviet Union ultimately collapsed, taking much of its ideology with it. That does not mean today’s Russia has stopped spreading disinformation. On the contrary.

With the fourth wave, we reach the 21st century. Disinformation was reborn and reshaped by modern technologies and the internet, peaking in the mid-2010s. The old trend of slow, sophisticated influence gave way to a high-tempo, less professional approach. Everything was fast, and there was too much information to process. Disinformation became more effective and less measured, and defences against it grew weaker and weaker.

1.3   Motivation and strategy of the aggressors

The goals of attackers are usually connected with influence. Authoritarian regimes perceive a strong and unified democracy as a threat. They aim to destabilise it: to undermine trust in political institutions, influence public opinion, and erode confidence in democratic processes or democracy itself. They also want to polarise society into two camps that do not communicate with each other and blame each other for everything bad, thereby paralysing the state and making effective decision-making impossible, in foreign policy and beyond.

One of the most common methods is tied to elections. The major geopolitical actors mentioned in the introduction attempt to put forward a candidate who will promote their policies, giving them more influence and better trading conditions with the target country. Such a candidate can also lift sanctions against the sponsoring power or sabotage allied efforts.

It is important to note that this candidate does not have to be sponsored directly by the country attempting to exert influence. It can be a candidate who, for instance, promotes an isolationist policy and therefore does not want to cooperate with other countries. We can see this approach in the USA today under the name “America First”: the USA withdraws its forces from Europe and other parts of the world and concentrates them at home. Europe is thereby weakened, and other powers, such as Russia, can act more freely.

Even if this candidate does not win, at least part of the goal can be achieved: the division of society into two large halves fighting against each other. This leads to distrust in the state and in the electoral process itself.

“It’s not news that terrorists and dictators can spread their lies faster and more effectively than ever (…) Stengel believes the US government as a whole has sunk into paralysis and unable to mobilize even a fraction of its vast resources to fight an information war. Hostile messages move too fast and too nimbly—trolls at the IRA do not have to coordinate their tweets with various desks or agencies before releasing them” (Leonardson, 2020, p. 4)

2.    Types of hybrid attacks

In contrast to the theoretical part, this chapter focuses on practice: the types of hybrid attacks, how they are employed, and their outcomes. It is essential to note that attackers do not rely on a single type of attack, but rather on a combination of them, to achieve a more powerful effect. This chapter consists of three parts: information operations, cyber operations, and technological operations.

2.1   Information operations

This is one of the most visible parts of hybrid warfare. For this chapter, it is necessary to explain three similar but distinct terms. The first is misinformation: false information shared without intent to harm. The second is disinformation: false information knowingly shared to cause harm. The last is malinformation: genuine information shared to cause harm, often by dragging into the public sphere information intended to remain private. Hybrid warfare relies on ordinary people picking up disinformation, believing it to be true, and inadvertently spreading it further as misinformation.
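To make these distinctions concrete, the taxonomy can be read as a small truth-versus-intent matrix. The sketch below is purely illustrative; the function and labels are not from any library, they simply encode the three definitions above (following Wardle & Derakhshan, 2017).

```python
# Toy encoding of the information-disorder taxonomy as a truth/intent matrix.
# Illustrative only; the labels follow the definitions given in the text.

def classify(information_is_false: bool, harm_is_intended: bool) -> str:
    if information_is_false and not harm_is_intended:
        return "misinformation"       # false, shared in good faith
    if information_is_false and harm_is_intended:
        return "disinformation"       # false, shared deliberately to harm
    if not information_is_false and harm_is_intended:
        return "malinformation"       # genuine, weaponised (e.g. leaked private data)
    return "ordinary information"     # true and benign

assert classify(True, False) == "misinformation"
assert classify(True, True) == "disinformation"
assert classify(False, True) == "malinformation"
```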

One model, used by Russia, is the “firehose of falsehood”. The RAND Corporation characterises it as follows:

“We characterize the contemporary Russian model for propaganda as ‘the firehose of falsehood’ because of two of its distinctive features: high numbers of channels and messages and a shameless willingness to disseminate partial truths or outright fictions. In the words of one observer, ‘New Russian propaganda entertains, confuses and overwhelms the audience.’” (Wardle & Derakhshan, 2017)

Russian propaganda is therefore produced at a rapid tempo and distributed via a large number of channels. It is embedded in texts, videos, images and audio shared on the internet and social media, as well as in national television and radio broadcasts. Because this propaganda is disseminated quickly, it is more likely to be accepted as fact: the human mind tends to accept the first information it receives, while trustworthy media take time to verify information before publishing.

The sheer volume of disinformation compounds this effect, so true, verified reports are lost in the mesh. This model is powered by paid, concentrated armies of “trolls”. In the past, these groups consisted of real people, but nowadays, with the rise of Artificial Intelligence, these people are being replaced, or more accurately supplemented, by bots. Their goal is to post, comment, and generate followers and views, creating the impression that far more people hold these opinions than really do. One common heuristic for spotting such coordination is sketched below.
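A minimal sketch, assuming hypothetical post records (the field names and thresholds are mine, not any platform’s real API): it flags clusters of accounts publishing identical text within a narrow time window, one of the simplest signals of coordinated inauthentic amplification.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical post records; field names and values are illustrative.
posts = [
    {"account": "u1", "text": "They are all corrupt, stay home!", "ts": "2024-11-22T10:00:05"},
    {"account": "u2", "text": "They are all corrupt, stay home!", "ts": "2024-11-22T10:00:09"},
    {"account": "u3", "text": "They are all corrupt, stay home!", "ts": "2024-11-22T10:00:12"},
    {"account": "u4", "text": "Lovely weather today.",            "ts": "2024-11-22T10:05:00"},
]

WINDOW_SECONDS = 60   # illustrative thresholds; real systems tune these
MIN_ACCOUNTS = 3

# Group posts by identical text.
by_text = defaultdict(list)
for p in posts:
    by_text[p["text"]].append((datetime.fromisoformat(p["ts"]), p["account"]))

# Flag texts posted by many distinct accounts within a short window.
for text, items in by_text.items():
    items.sort()
    span = (items[-1][0] - items[0][0]).total_seconds()
    accounts = {account for _, account in items}
    if len(accounts) >= MIN_ACCOUNTS and span <= WINDOW_SECONDS:
        print(f"possible coordinated burst: {len(accounts)} accounts in {span:.0f}s: {text!r}")
```

Real detection pipelines combine many such signals (account age, posting cadence, content similarity), but the shape of the heuristic is the same.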

2.2   Cyber operations

In this chapter, two methods will be described. The first is called “Hack-and-Leak”. Here, a hacker or a group of hackers uses cyber tools to gain access to sensitive or secret material: secret documents, photos, videos, and other resources that could harm the target. After obtaining these resources, attackers can take multiple actions. They can sell the information for money, use it for manipulation and blackmail, or, as we will see later, use it to manipulate public opinion and sway elections.

The second method is broader and more immoral. It is called “Tainted Leaks”. In this approach, attackers use the information they obtain but edit or add to it to alter its meaning. This method was used in the past, as J. E. Leonardson reports:

“Soviets’ most successful information operations did not rely exclusively on lies or forgeries. Instead, they generally were built on foundations of truth that lent credibility to small amounts of added-on falsehoods.” (Leonardson, 2020)

The goal of these alterations is to discredit the targeted person. This method is more dangerous than the first, because the falsehoods are embedded in genuine material and are therefore hard to refute.

2.3   Technological operations

This is a critical topic in today’s era of Artificial Intelligence. Disinformation campaigns created with Artificial Intelligence have become a regular feature of today’s world. One of the most dangerous methods used in hybrid attacks is the deepfake. Deepfakes are AI-generated images, video, text, or audio that appear to be real and are created to manipulate people. It is challenging to determine whether such media are genuine or AI-generated. This method is not limited to interstate hybrid warfare; we can also encounter it in our everyday lives.

This tool is frequently used by fraudsters to create videos featuring trustworthy politicians, athletes, or other celebrities, in which they appear to endorse investment platforms; many people have lost their money as a result. In the electoral context, we can imagine deepfaked politicians discussing wars, mandatory military service, tax increases, and similar topics.

3.    Case studies of hybrid warfare during elections

In this chapter, we will connect the theory with practical examples from three different cases. The first is the world’s most significant affair, because the target was one of the most influential countries, the USA. The second example comes from the heart of Europe, Slovakia, where a view-changing deepfake audio clip was created and spread among voters. The last example is the repeated presidential election in Romania. This radical step of repeating an election shows one way of dealing with a hybrid attack, in this case a massive campaign on social networks.

3.1   Presidential elections in the USA 2016

The most significant affair of hybrid warfare during elections is undoubtedly the 2016 presidential election in the USA. Many reports document Russian attacks on that election. Twelve Russian military officers were indicted for interference: they hacked the computers of U.S. persons involved in the election, stole confidential documents, and later released them to influence the vote. Two of the officers were also charged with hacking into computers of U.S. persons responsible for election administration and of U.S. companies that supplied election-administration software.

U.S. intelligence agencies also found that the Russian government hacked directly into the email system of the Democratic Party to boost the Republican campaign effort. There was extensive disinformation on social media, and attempts were made to probe the voting systems of all 50 U.S. states. According to the Senate Intelligence Committee report, the GRU, FSB and SVR (Russian intelligence agencies) were behind these operations.

This is an example of the “Hack-and-Leak” method. It is difficult to determine whether these attacks affected the final election results, but they undoubtedly affected U.S. and global security. Because the interference was seen as a threat to liberal democratic structures, many U.S. policymakers mobilised significant resources in response, strengthening cybersecurity and increasing security around elections.

The goal of such attacks is not necessarily just to support one of the candidates; it can also be to undermine trust in democratic processes in the USA and to divide Americans.

3.2   Slovakia and Deepfake audio

Now we will look at the “Slovak case”. In the 2023 parliamentary elections, we witnessed what may have been the first use of deepfakes in elections. It is now widely seen as the “dawn of a new era of disinformation” and a “test case”. Will this soon occur in more countries around the world? And what actually happened?

In 2023, parliamentary elections were held in Slovakia. The main favourites were Michal Šimečka of PS (Progresívne Slovensko), a pro-EU political party, and Robert Fico of Smer, a party that opposes sanctions on Russia, criticises the European Union, and rejects military support for Ukraine. Pre-election polls put their support at similar levels: some placed PS first (averaging 18.68 %), others Smer (averaging 19.10 %).

The turnaround came just two days before the elections, when an audio clip appeared purporting to capture Michal Šimečka, the pro-European leader of PS, talking with a prominent journalist about committing electoral fraud. The clip went viral very quickly, although both of them denied its authenticity.

Such attacks have their greatest impact immediately after release, and this one was no exception. Its effect was multiplied not only because it appeared just two days before polling stations opened, but also because it was released during the “silence period”, the time before elections when the media are prohibited from discussing election-related themes. The final results were 22.94 % for the winning party, Smer, and 17.96 % for PS.

After this case, several observers called it proof that images, videos and audio can no longer be trusted as evidence. That statement is somewhat exaggerated, but it highlights the seriousness of the case. Again, the result of this audio clip was not just a win for Robert Fico’s Smer; it also eroded public trust in institutions, the media, and democracy, and left society divided and angered.

3.3   Repeated elections in Romania

In 2024, presidential elections were held in Romania; they ended in annulment and were subsequently repeated. This decision was historic, radical and drastic. It came after rapidly emerging information about state-sponsored interference in the electoral process and hybrid activities originating from Russia. The court cited concern for the integrity of the vote, as one of the candidates had allegedly benefited from unfair promotion.

The first round of the 2024 Romanian presidential election took place on November 24, 2024. Polls favoured Marcel Ciolacu, with almost a quarter of all votes; he was the prime minister of Romania until he resigned after the final presidential election in May 2025. Right behind him was Elena Lasconi, whose support rose over time to nearly 20 %; she was considered the liberal, pro-European candidate of the parliamentary opposition. There was also a populist, hard-right candidate, Călin Georgescu, polling between 5 and 8 %.

Then came election day, and the entire nation witnessed something truly remarkable. The winner was the “dark horse” Călin Georgescu, with 22.94 %; Elena Lasconi and Marcel Ciolacu received 19.17 % and 19.14 % of the votes, respectively. Of course, polls can be wrong, but such a significant difference is unusual, so an investigation began. On December 6th, the Romanian Constitutional Court made a radical decision: it annulled the presidential election, just two days before the second, final round.

“The decision to annul the first round of Romania’s presidential election revolves around declassified documents from the country’s intelligence services that allege that a coordinated campaign promoted pro-Russian candidate Georgescu to unexpectedly garner the largest percentage of the vote on November 24.” (Atlantic Council experts, 2024, December 6)

The SRI (Serviciul Român de Informaţii) also reported that nearly a million euros were spent on the campaign, with almost 950 euros paid for a single repost. TikTok itself admitted to receiving 362,500 euros from an unknown source. The platform also stated that it had deleted 66,000 fake accounts and a significant number of fake followers targeting a Romanian audience. A huge, previously dormant network existed on TikTok, Facebook and Telegram, activated only two weeks before the elections. The network’s operators were hired and coordinated through a Telegram channel. They used hashtags associated with Călin Georgescu, gaining him significant visibility and popularity on TikTok.

This was an attack on a NATO ally, whose presidential election was almost stolen by foreign intervention. Romania’s democracy fortunately proved strong and resilient enough to stop it. The case demonstrated the power of social media and its profound impact. We can protect ourselves against such attacks, but at what cost? It is never a good thing when an election is annulled, as that in itself undermines trust in democracy.

4.    Dangers and impacts of hybrid warfare

This section will focus on the impacts of hybrid warfare and the dangers it poses. Many people would argue that the sole goal of hybrid warfare during elections is to ensure the victory of the chosen and supported candidate. That is certainly one goal, but there are many hidden ones, just as dangerous.

The first impact is an increase in disagreement about facts and about analytical interpretations of facts and data. Of course, some “facts” are supported only by small groups or individuals, but we focus here on claims widely supported by data and evidence that are nonetheless increasingly disputed: the safety of vaccines, for example, or, more absurdly, the shape of the Earth.

In these cases, disagreement can be caused by the dissemination of disinformation, by emotion, or by biases that reject facts, data, and analysis. People often believe the traditional media are lying and look for other ways to gather information. This fuels the growth of disinformation and misinformation sites, which become a primary source of information for these individuals. Hybrid warfare promotes such sites, or works to damage trust in traditional media, to prevent rational choice.

The second impact is the growing weight of opinions and personal experiences over facts. These opinions and experiences are shared primarily on social media, typically without any supporting evidence. Information is everywhere today: traditional media broadcast 24 hours a day, anyone can post whatever they want on social media, and it is impossible to verify it all. People are overwhelmed and do not know what to believe.

The way out of this confusion is often offered by people who shout above the information mess, whether they are telling the truth or lying. Confused citizens see a strong individual who “must know” what is right and what is good for them. Unfortunately, these loud individuals are often exploited for the benefit of others and are merely pawns of the larger players. We can see this in every election in every country: a figure who disseminates anger and divides society.

The next impact of hybrid attacks is a decline in trust in formerly respected sources of information: confidence in major institutions, such as government, television or newspapers, is falling. The percentage of Americans expressing a high level of confidence in Congress decreased from 22 % in 1997 to 9 % in 2016. As trust declines, people seek alternatives, which in turn feeds the impacts described above.

This is not the case for all institutions: trust in the medical community and in public schools has remained nearly the same. It is also notable that the decline in trust is more pronounced in some groups than in others. People who identify as liberal or independent show no significant change in their trust in science, while among self-identified conservatives it has declined in recent years.

All of this can lead to political paralysis. When people cannot agree on basic facts, neither can politicians. A dividing line forms between them, and politics is staged as a battle between good and evil. This also weakens the authority of government institutions. Such paralysis produces ineffective and far more costly decision-making: the entire country can be held back by pervasive arguments and obstruction.

5.    Prevention and protection

The Hague Centre for Strategic Studies has released a document providing guidelines for addressing hybrid threats. It divides the process into five stages: the Preparation Stage, the Detection and Attribution Stage, the Decision-Making Stage, the Execution Stage, and the Evaluation Stage.

The preparation stage is the part of the defensive process that takes place before any actual hybrid attack. It builds the capability to detect threats so that they can be addressed later. The first step is to set the boundaries of unacceptable behaviour that will trigger a response. When these definitions are sufficiently broad, they can deter attackers from even attempting an attack, because they know their actions would fall outside the neutral zone.

Step two is communication. If society is aware of the dangers of hybrid attacks and of what constitutes a hybrid threat, it will not be surprised when an incident occurs; people can also anticipate the reaction and will not consider it radical. Alongside public communication there is private communication, which prepares security centres for how to react, so that everything happens faster and more clearly.

The second stage is detection and attribution. As the name suggests, its core is detecting a hybrid attack using the detection capabilities developed in the previous stage. After detection come the consideration of attribution options and the decision whether to act. If the decision is to act, it must be communicated to the cybersecurity bodies and to allied third parties to gain support.

The third stage is decision-making, fundamental to any successful strategy. The first part is choosing response options and identifying possible targets. Each option should be weighed for its legality, the duration and timeframe in which it would be active, the proportionality of the countermeasure, and the risk of escalation. The second part is assessing the effectiveness of the response and its financial impact, both on the aggressor and on the state or institution executing the response. Before execution, it is also essential to ensure there is enough support for the countermeasure.

In the fourth stage, we finally reach the execution of the counterattack. The response judged best is executed and countermeasures are implemented. This should happen as soon as possible, to limit the damage and to warn the aggressor that we are ready to respond. The whole process should be monitored so that it can be analysed afterwards and reported to allies and leaders.

Last but not least comes the evaluation stage. Here the effectiveness of the response is assessed, and improvements are fed back into the solution to achieve even better results.
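Read as a process, the five stages form a loop in which evaluation feeds back into preparation. The sketch below is only a schematic rendering of that sequence under my own naming; the HCSS framework is a policy process, not software.

```python
from enum import Enum, auto

# Schematic model of the five HCSS stages as a cyclic pipeline.
class Stage(Enum):
    PREPARATION = auto()
    DETECTION_AND_ATTRIBUTION = auto()
    DECISION_MAKING = auto()
    EXECUTION = auto()
    EVALUATION = auto()

PIPELINE = list(Stage)  # definition order matches process order

def next_stage(current: Stage) -> Stage:
    # Evaluation wraps around to preparation: lessons learned harden
    # detection and deterrence for the next incident.
    i = PIPELINE.index(current)
    return PIPELINE[(i + 1) % len(PIPELINE)]

assert next_stage(Stage.EXECUTION) is Stage.EVALUATION
assert next_stage(Stage.EVALUATION) is Stage.PREPARATION
```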

This is a general procedure, but there are also safeguards aimed at protecting individuals. One example is mandatory watermarking, a reaction to the increasing number of deepfakes. AI watermarking is the process of embedding a recognisable and unique signal (the watermark) into the output of an Artificial Intelligence system, which serves to identify whether content is AI-generated. There are different watermarking techniques for text, images, video and audio.
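For text, one published family of techniques biases a language model toward a pseudo-random “green list” of tokens that a detector can later count (Kirchenbauer et al. propose a scheme of this kind). The sketch below is a heavily simplified illustration of that idea, not a real watermarking library; the vocabulary, functions and thresholds are all invented for the example.

```python
import hashlib
import random

# Toy "green-list" text watermark. At each step, the previous token seeds a
# pseudo-random split of the vocabulary; a watermarking generator prefers
# tokens from the "green" half. A detector that knows the scheme counts green
# hits: natural text scores near 0.5, watermarked text well above it.

VOCAB = ["the", "a", "vote", "poll", "trust", "media", "truth", "claim"]

def green_list(prev_token: str) -> set[str]:
    # Hash the previous token into a PRNG seed and pick half the vocabulary.
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return set(rng.sample(VOCAB, k=len(VOCAB) // 2))

def green_fraction(tokens: list[str]) -> float:
    # Fraction of tokens that fall inside their predecessor's green list.
    hits = sum(tok in green_list(prev) for prev, tok in zip(tokens, tokens[1:]))
    return hits / max(len(tokens) - 1, 1)

# A generator that always picks green tokens yields text whose green_fraction
# is close to 1.0, flagging it as machine-generated.
```

Image, video and audio watermarks follow the same logic, with signal-level rather than token-level marks.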

Another practical example is the Digital Services Act (DSA), a regulation of the European Union that aims to “create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses” (The Digital Services Act package | Shaping Europe’s digital future, 2025, November 20). Among other things, it provides an easier way to report illegal content, goods and services, stronger protection for targets of online harassment and bullying, transparency around advertising, and simplified terms and conditions.

6.    Conclusion

Hybrid warfare has transformed from spying, kidnappings and executions into an everyday reality of “grey zone” attacks, in electoral processes and beyond. These threats have evolved into algorithm-driven operations and information overload. Everything has become faster than before and, more importantly, harder to detect. We are witnessing highly sophisticated deepfakes and the division of society.

These hybrid attacks threaten even the biggest and strongest democracies in the world. We now know that the most significant damage is not a change in the electoral result, but the undermining of trust in government, politicians, scientists and, above all, democracy. The aggressors’ goal is often polarisation and societal division, leading to state paralysis, where everything slows down and rational decision-making becomes impossible.

So, will elections remain independent? And were they ever completely independent? Absolute independence is, in this age, only an illusion: elections will always be a battlefield in some way, but we must not accept defeat and give in to these aggressors. The solution is not the elimination of all threats, which is impossible in this “digital world”; rather, it is a question of how fast and effectively countries and democratic societies can learn to protect themselves against hybrid threats.

References

Ray, S. (2024, January 3). 2024 Is The Biggest Election Year In History—Here Are The Countries Going To The Polls This Year. Forbes. https://www.forbes.com/sites/siladityaray/2024/01/03/2024-is-the-biggest-election-year-in-history-here-are-the-countries-going-to-the-polls-this-year/

Rid, T. (2020). Active Measures: The Secret History of Disinformation and Political Warfare. Profile. https://books.google.cz/books?id=lWltDwAAQBAJ

Nadal, L. de, & Jančárik, P. (2024). Beyond the deepfake hype: AI, democracy, and “the Slovak case”. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-153

du Cluzel, F. (2021). Cognitive Warfare. NATO Innovation Hub. https://innovationhub-act.org/wp-content/uploads/2023/12/20210113_CW-Final-v2-.pdf

Hoffman, F. (2007). Conflict in the 21st Century: The Rise of Hybrid Wars. Potomac Institute for Policy Studies. https://www.academia.edu/22883467/The_Rise_of_Hybrid_Wars

Countering hybrid threats. (n.d.). NATO. Retrieved 18 November 2025, from https://www.nato.int/en/what-we-do/deterrence-and-defence/countering-hybrid-threats

Rogal, A., & Gurban, A. (2024, December 4). Declassified reports hint at “state actor” behind Georgescu’s campaign. Euronews. http://www.euronews.com/my-europe/2024/12/04/declassified-romanian-intelligence-suggests-state-actor-behind-georgescus-campaign

The Digital Services Act package | Shaping Europe’s digital future. (2025, November 20). https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package

Madiega, T. (2023). Generative AI and watermarking. EPRS | European Parliamentary Research Service. https://www.europarl.europa.eu/RegData/etudes/BRIE/2023/757583/EPRS_BRI(2023)757583_EN.pdf

Gray Zone Project | CSIS. (n.d.). Retrieved 17 November 2025, from https://www.csis.org/programs/gray-zone-project

Hack-and-Leak Operations and U.S. Cyber Policy. (2020, August 14). War on the Rocks. https://warontherocks.com/2020/08/the-simulation-of-scandal/

Wardle, C., & Derakhshan, H. (2017, September 27). Information Disorder—Toward an interdisciplinary framework for research and policymaking. Council of Europe Publishing. Retrieved 19 November 2025, from https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html#

Pazzanese, C. (2016, December 14). Inside the hacked U.S. election. Harvard Gazette. https://news.harvard.edu/gazette/story/2016/12/inside-the-hacked-u-s-election/

Leonardson, J. E. (2020, March). Intelligence in Public Media. CIA Center for the Study of Intelligence. https://www.cia.gov/resources/csi/static/active-measures-and-information-wars.pdf

Opinion polling for the 2024 Romanian presidential election. (2025). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Opinion_polling_for_the_2024_Romanian_presidential_election&oldid=1311579582

Sedesatka.cz. (n.d.). Parlamentní volby na Slovensku 2023: Průzkumy a výsledky online. Retrieved 20 November 2025, from https://www.sedesatka.cz/rubriky/politika/parlamentni-volby-na-slovensku-2023-pruzkumy-a-vysledky-online_1789.html

Atlantic Council experts. (2024, December 6). Romania annulled its presidential election results amid alleged Russian interference. What happens next? Atlantic Council. https://www.atlanticcouncil.org/blogs/new-atlanticist/romania-annulled-its-presidential-election-results-amid-alleged-russian-interference-what-happens-next/

Kirby, P., & Barbu, M. (2025, May 6). Romanian PM Ciolacu and party quit government after nationalist vote win. BBC News. https://www.bbc.com/news/articles/cj3xk8prxy8o

Russian interference in 2016 U.S. elections. (n.d.). Federal Bureau of Investigation. Retrieved 19 November 2025, from https://www.fbi.gov/wanted/cyber/russian-interference-in-2016-u-s-elections

Serviciul Român de Informații. (n.d.). Retrieved 20 November 2025, from https://www.sri.ro/

Hulcoop, A., Scott-Railton, J., Tanchak, P., Brooks, M., & Deibert, R. (2017). Tainted Leaks: Disinformation and Phishing With a Russian Nexus (Citizen Lab Research Report No. 92). University of Toronto. https://citizenlab.ca/2017/05/tainted-leaks-disinformation-phish/

Bertolini, M., Minicozzi, R., & Sweijs, T. (2023, April). Ten Guidelines for Dealing with Hybrid Threats: A Policy Response Framework. The Hague Centre for Strategic Studies.

The Repressive Power of Artificial Intelligence. (n.d.). Freedom House. Retrieved 19 November 2025, from https://freedomhouse.org/report/freedom-net/2023/repressive-power-artificial-intelligence

Kavanagh, J., & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation. https://www.rand.org/pubs/research_reports/RR2314.html

Anghel, V. (2024, December). Why Romania Just Canceled Its Presidential Election. Journal of Democracy. https://www.journalofdemocracy.org/online-exclusive/why-romania-just-canceled-its-presidential-election/
