Following Russia’s annexation of Crimea in 2014 and the conflict in southeastern Ukraine, NATO deployed four battlegroups to Poland and the Baltic states. The Netherlands contributes troops to the battlegroup in Lithuania, a country that considers itself a ‘front state’ against the Russian Federation and is a desirable target for Russian disinformation campaigns. How is the country targeted by Russia, and how does Lithuania protect itself? This article provides an explanation of Russian disinformation, or dezinformatsiya, its history, and how it relates to other known terms, such as active measures. It is paramount for societies, including the Dutch, to be well aware of the likelihood of being targeted by Russian dezinformatsiya campaigns.

‘You cannot fight lies with lies’, says Ričardas Savukynas, a Lithuanian elf. ‘When I see there are propaganda movements which are directed at the preparation of war, I need to do something!’[1] Savukynas’ concern shows the kind of conflict nations are involved in today. It also depicts the tense atmosphere between the Baltic states and the Russian Federation over the past six years. The Baltics are seriously targeted by Russian influence campaigns, since Mother Russia still feels responsible for the safety of the ethnic Russians and Russian speakers in the Baltics.[2] In Estonia and Latvia more than a quarter of the population is ethnic Russian, although in Lithuania it is only about five per cent.[3] Lithuania is, in a different way, a desirable target for Russian disinformation: it borders on the Russian exclave of Kaliningrad and contains, together with Poland, the so-called Suwalki corridor, flat terrain between Belarus and Kaliningrad that could function as a perfect link-up passage for Russian troops if necessary.[4]


Dutch F-16s in Lithuania participate in NATO’s Baltic Air Policing mission. Photo MCD, Hille Hillinga

Following Russia’s annexation of Crimea in 2014 and the conflict in the Donbas, the southeast region of Ukraine, member states at NATO’s 2016 Warsaw summit agreed to forward deploy four multinational battlegroups to the Baltics and Poland.[5] In 2017, the Netherlands armed forces contributed substantially to the security of the Baltic states, deploying four Dutch F-16s for the Baltic Air Policing task, one raiding squadron of marines for the Very High Readiness Joint Task Force, two ships and a submarine for the Standing Naval Forces, and one infantry company as part of the German-led battlegroup in Lithuania.[6] Today, 270 Dutch military personnel are still attached to the battlegroup and stationed in Rukla, Lithuania, as ‘reassurance measures’ for eastern European Allies in NATO.[7] The Netherlands makes a significant contribution to the protection of NATO territory in the Baltics, and should therefore be aware of potential threats. In 2018, Christian Kamphuis warned in his Militaire Spectator article that Dutch troops in Lithuania are a likely target for Russian smear campaigns. Kamphuis described an incident in which Dutch soldiers based in Lithuania were falsely accused of being publicly drunk and disorderly.[8] In other words, Russia’s disinformation campaigns directed against the Baltics are also of interest to the Netherlands. If only because the Netherlands can easily become involved, this article concentrates on the following question: how are Russian disinformation campaigns used against Lithuania?

To answer this question, this article first explains Russian disinformation, or dezinformatsiya, its history and how it relates to other known terms, such as active measures, propaganda and kompromat. It then describes present-day appearances of dezinformatsiya, how Russian authorities are currently harassing Lithuania, and how Lithuania is protecting itself.

A description of dezinformatsiya

To determine what kind of disinformation the Russian Federation uses against Lithuania, the concept of dezinformatsiya must be scrutinised first. Van den Herik, Molendijk and Bouwmeester already made a distinction in their article between mis-, dis- and malinformation.[9] Disinformation is a carefully crafted message to mislead the decision-making elite or the public, with every message at least partially conforming to generally accepted beliefs. Without a considerable degree of plausible information, it is difficult to gain the victim’s confidence.[10] Otherwise the disinformation will not be accepted by its target audience.[11] Today the concept of dezinformatsiya is still used by Russian authorities and has been reframed by Western experts as the Kremlin’s ‘weaponization of information’.[12] Russian authorities use two different types of disinformation. The first category is offensive disinformation, used to influence foreign decision-makers and public opinion abroad. The second category is defensive disinformation, which Russian authorities employ to influence their own citizens.[13] This form of disinformation is primarily intended to combat the interference of the West in Russian society. The Chief of the General Staff of the Russian armed forces, Valery Gerasimov, stated that the West, especially the United States, is using ‘weapons of mass disorganization’, such as cyber, media, intelligence services and diplomacy, to upset Russian society. Gerasimov considered these new conflict methods a modern Trojan horse.[14]

History of dezinformatsiya

The origin of dezinformatsiya is still debatable. Some experts are convinced that Joseph Stalin decided that disinformation should look as if it were originally French. He organized an information campaign in which the word dezinformatsiya seemed to be derived from the French language, as a portmanteau of the words ‘des’ and ‘information’. It was a meaningless expression, but another form of Russian ruse: Stalin made it appear that dezinformatsiya was a French ‘capitalist’ tool targeted against the peaceful people of the Soviet Union.[15] Soviet intelligence officer Walter Krivitsky, born Samuel Ginsberg, who served in Germany, Poland, Austria, Italy, Hungary, and the Netherlands, had another view on the origin of dezinformatsiya, dating it back to the First World War. The German armed forces established a General Staff Disinformation Service to disseminate misleading information and news in order to confuse their adversaries. The first Soviet secret service adopted the term and the underlying techniques and used them for its own purposes, translating the term into dezinformatsiya.[16]

The Ukrainian military base in Crimea was surrounded by 'little green men', who were supported by a Russian dezinformatsiya campaign

While there had been successes during the early Cold War, dezinformatsiya did not catch on until the early 1960s. In 1959 the KGB[17] established Department D, a unit directly connected to the Presidium of the Soviet Communist Party whose main task was the dissemination of dezinformatsiya. Department D consisted of forty to fifty personnel, divided by region and function. In 1962, Department D was upgraded to the status of a service, Service A, under the direct supervision of the First Chief Directorate of the KGB. Ivan Agayants, a legendary KGB officer, became Chief of Service A. Five years after its foundation, Service A managed nearly 400 dezinformatsiya operations per year. Agayants maintained a strict policy for recruiting the personnel who conducted dezinformatsiya operations. A new agent needed to be able to think creatively, empathise with other cultures and think out-of-the-box, alongside possessing personal characteristics such as rigour, self-discipline and ideological determination.[18]

During the 1980s the Soviets often used dezinformatsiya in an opportunistic manner: the KGB seized upon unplanned incidents to launch major dezinformatsiya campaigns. One example is the attack on Pope John Paul II in 1981 by a Turkish terrorist, which Soviet dezinformatsiya portrayed as CIA retaliation. Another example is the shooting down of Korean Air Lines flight KAL007 near Sakhalin Island in 1983 by a Soviet Sukhoi Su-15 interceptor, resulting in 269 fatalities. This incident was initially surrounded by conflicting reports and eventually dismissed as a purely defensive measure that had been hard to avoid.[19]

After the dissolution of the Soviet Union, Western interest in dezinformatsiya faded into the background, but that suddenly changed after the annexation of Crimea in 2014. At the time, the world was shocked to see how masked soldiers in uniforms without insignia, later referred to as ‘little green men’ by the Western media, could take over an entire peninsula belonging to Ukraine without firing a shot. The action was attributed to the Russian Federation, but the Russian authorities remained silent and initially denied their involvement. The activities of the little green men were supported by a dezinformatsiya campaign, which came not only from the Russian media but also from Russian politicians. In April 2014, for example, Russian Foreign Minister Sergei Lavrov accused the West of being the initiator of all the unrest in Ukraine in order to gain more control in the region.[20]

Relation with active measures

The Soviet Union and the Russian Federation have a long tradition of misleading groups of people with manipulated information. With roots in Leninist thinking, mainly aimed at controlling their own population and influencing public opinion, the Soviets developed a series of deceptive activities that invariably included terms such as dezinformatsiya, active measures, propaganda, and kompromat.[21] The question now is whether these are all different concepts. The answer is simply ‘no’: the terms partly overlap, as explained in the following sections. It is striking that these concepts are again widely used in Russian dezinformatsiya operations today.

Some Russian and Western experts in information warfare use the term dezinformatsiya to refer to what the Soviet leaders called ‘active measures’.[22] Although active measures are sometimes considered just another term for dezinformatsiya, the two are not quite the same. Dezinformatsiya is merely one of the overt and covert influencing practices used by the Soviet and later the Russian leadership in these so-called active measures.[23] Soviet authorities viewed dezinformatsiya as a strategic weapon, useful in their overall active measures strategy. Active measures, or aktivnyye meropriyatiya, in turn, is a Soviet term for active intelligence operations to influence world events in order to reach one’s own geopolitical aims.[24]

Retired KGB General Oleg Kalugin saw dezinformatsiya as one of the critical components of active measures, together with subversive activities. Kalugin viewed subversion as ‘active measures to weaken the West, to drive wedges in Western community alliances of all sorts […] [and] sow discord among allies’.[25] Active measures focused on and exploited opponents’ vulnerabilities in order to expand Soviet influence and power around the globe.[26] They vary from media forgeries to messages that can provoke reactions with various degrees of violence. Active measures are broader than disinformation alone; they include propaganda, subversive activities, counterfeiting official documents, disinformation operations leading to assassinations, agents of influence, political domination, and various forms of religious suppression.[27] The ‘old’ active measures are still present in current Russian activities; they only look different. Current active measures include modern dezinformatsiya and subversion methods, such as deploying Orthodox priests, Russian government-funded news media outlets like RT and Sputnik, spies and ‘computer hackers to ride and help create the wave of populist anger’.[28]

Relation with propaganda

Propaganda has a specific position within dezinformatsiya. In 1935, Leonard Doob, Professor of Psychology at Yale University, concluded that most propaganda uses stereotyping and suggestion. Stereotyping is the process in which people create mental images about human character traits and appearances and use these images to judge other people. In the case of propaganda, the propagandist constructs a picture or a narrative that his target group is ready to wholeheartedly accept.[29] This construction can be used as a stimulus to generate a suggestion, which affects people’s reaction and behaviour, and often their attitude.[30] A Harvard University study into Nazi propaganda emphasized the contrast between ‘Us versus Them’ as the main theme in propaganda.[31] The propagandist (‘Us’) tries to persuade the public by intensifying his own ‘good’, using glorifying wording, and downplaying his own ‘bad’, while he also intensifies the other party’s (‘Them’) ‘bad’, using denigrating language, and downplays the other party’s ‘good’, denying its positive behaviour and actions.[32] Russian practice knows two specific forms of propaganda: agitprop and spetspropaganda.

Agitprop is a portmanteau of ‘agitation’ and ‘propaganda’. Agitation indicates the emotional part of propaganda, referring to how the message is received and to the mental state of the receiver. Propaganda, on the other hand, refers to the framing of the message and the way the message should be disseminated.[33] Agitprop is a form of political propaganda, especially communist, which was often used during the Soviet era. Emotional agitation puts the recipient in a condition in which he will act erratically and in a non-rational way. In order to reach a large audience, agitprop is spread to the general public through popular information channels, like literature, plays, movies, pamphlets, paintings and other art forms that all carry political messages, overtly or covertly.[34]

Lithuanian President Gitanas Nauseda (right) visited the ‘General Silvestras Zukauskas’ Training Area in Pabrade, where he met with Lithuanian, German and United States troops and evaluated their readiness. Photo Office of the President of the Republic of Lithuania

Spetspropaganda, which is short for ‘special propaganda’, was first taught in 1942 as a separate subject at the Military Institute of Foreign Languages in Moscow. It was removed from the curriculum in 1990 but reintroduced in 2000 after the institute had been reorganized.[35] Spetspropaganda was used for blocking influence and for applying pressure and manipulation. The Soviets used spetspropaganda in line with the social-technical principles of successful propaganda: (1) the principle of a massive and long-lasting impact, (2) the principle of believing desired and manipulated information, (3) the principle of supposed obviousness, and (4) the principle of emotional agitation, as in agitprop.[36] The creation of dezinformatsiya, agitprop and spetspropaganda did not stop after the collapse of the Soviet Union. These forms of influencing are still used by Russian authorities today. Established Russian media platforms, such as RT,[37] together with news agencies, such as Sputnik and Rossiya Segodnya, create and disseminate story lines, frames, agitprop and spetspropaganda. These media outlets are still at the heart of Russia’s activities in the information environment.[38]

Relation with kompromat

Kompromat, meaning ‘compromising material’, is a special brand of dezinformatsiya and refers to discrediting information that can ‘be collected, stored, traded, or used strategically across all domains: political, electoral, legal, professional, judicial, media, or business.’ Russian kompromat operations are machinations exercised through the circulation of often ‘unsubstantiated or unproven information’ (documents, messages, files, etcetera), which are destructive for all those involved. Kompromat has four ideal types, the first of which entails revelations about a victim’s political activities, such as abuse of power, discrediting connections, and political disloyalty. The second type involves a victim’s disreputable, sometimes illegal, economic activities, such as questionable apportionment of budgets, fraudulent bank deals, capital flight, and preferential treatment in business agreements. The third type comprises accusations of victims taking part in criminal activities, including organized crime, contract killing, spying, tapping, and blackmail. The fourth type of kompromat contains revelations about a victim’s private life, especially those created to discredit the victim. This type includes details of illegitimate income or property, sexual behaviour, sexual orientation, health, and misbehaviour of family members of the victim. Kompromat, in any of these four types, does not necessarily have to consist of manipulated information; it may also be factual and accurate. To give an example of kompromat: in the summer of 1997 the Russian Minister of Justice, Valentin Kovalev, was removed from his position after a Russian newspaper, Sovershenno Sekretno, published pictures of Kovalev in the arms of prostitutes in a sauna controlled by a criminal group called Solntsevskaia. The minister insisted that he had been lured into a trap.[39]

Current appearances

Contemporary Russian activities in the information environment, including dezinformatsiya campaigns, are designed along the four elements of former disinformation operations, also known as the 4-D approach: dismiss, distort, distract, and dismay.[40] In 2007 Alexandr Bedritsky, a Russian strategist, wrote that the key to current Russian warfare is not to destroy the enemy’s morale or psyche or to bring about physical destruction, but rather to shape a perception of reality that is in line with Russian interests.[41] It may be argued that the contemporary way in which information and intelligence are gathered and possible opponents are manipulated makes Russia’s disinformation operations very effective. Russia’s covert activities include espionage, hacking, stealing, and laundering; its semi-covert actions consist, among other activities, of troll activity, forgery, disruption, and amplification, while the overt method is to provide propaganda pushers and fake news launderers with misleading information.[42] Erik Donkersloot rightly argued in his Militaire Spectator article that ‘Russian operations are mostly designed to disrupt hostile societies and fuel internal polarization in target nations’.[43] In line with these intentions, the tactics of the Russian authorities are to confuse rather than to convince a target audience, and to divide opinions instead of providing new insights. By creating many different storylines, Russian authorities attempt to deny audiences the ability to distinguish between truth and falsehood. At the same time, the spokesperson of the Russian Ministry of Foreign Affairs has often raised concerns about the risk of disinformation in the Western media, in which the Russian Federation is portrayed very negatively, and has brazenly called on the United Nations to formulate a global strategy against disinformation and fabricated news.[44]

Kremlin Trolls

Dezinformatsiya operations can be conducted by Russian politicians and diplomats, mainstream media, non-governmental organisations (NGOs), or through cultural programmes and other means. One of the notable ways of distributing dezinformatsiya is through social media, by so-called bots, automated social media accounts, and ‘Kremlin Trolls’, fake social media accounts managed by Russian volunteers.[45] The Kremlin Trolls are part of the Russian Internet Research Agency (IRA). The IRA began its operations in 2013 in Saint Petersburg. From the start, the agency was run as a sophisticated marketing bureau in centralised office surroundings in Russia’s second city. The IRA employed and trained over a thousand people to conduct round-the-clock influence operations.[46] The IRA has often been called the ‘Troll Farm’ or the ‘Russian Troll Factory’.[47] The agency started out as the IRA and was later called Teka. Nowadays it is called Glavset, which was legally formed in 2015. It is interesting to note that Glavset’s corporate address is in Rostov-on-Don, but its physical address is in Saint Petersburg. Glavset is housed and financially supported by Yevgeny Prigozhin, also known as ‘Putin’s Chef’ because the President personally chose his company to cater several of his exclusive presidential receptions and dinners. Members of Glavset mask their internet activities using proxy servers and other anonymizers in order to astroturf.[48] Their main products are propaganda, fake news, and trolling, that is, posting provocative reactions in the comment sections of online articles.[49]

The trolls or operators at Glavset work in twelve-hour shifts, on a 24/7 basis. Individual operators run multiple fake accounts and are expected to produce around fifty comments on news articles every day. Other operators maintain six Facebook accounts, posting three times daily about the news and discussing new developments in Facebook groups twice a day, with a target of at least 500 subscribers by the end of the first month. On Twitter, operators run around ten accounts with up to 2,000 followers each, producing at least fifty tweets daily. Those making comments are required to post 135 remarks during their shift. Operators are provided with five keywords or topics to use in their postings in order to rank high in search engines, so that internet users searching for these topics end up on earlier troll postings.[50] The ultimate goal of the Kremlin Trolls is to initiate a gradual process of undermining Western democracies and disrupting democratic institutions in those nations.


Lithuanian military personnel during NATO’s exercise Winter Wolf. Photo NATO

It is strongly believed that the Kremlin Trolls first targeted Ukrainian and Russian citizens and, subsequently, American citizens well before the United States elections in 2016.[51] Today, there is a strong suspicion worldwide that the Kremlin Trolls have played a misleading role in the conflict in the Donbas and in the narratives surrounding the cause of the downing of flight MH17, and they are furthermore accused of having been involved in the Brexit referendum,[52] the 2016 American elections, and the leaking of correspondence of French President Emmanuel Macron’s party La République En Marche! (‘The Republic on the Move!’).[53] Some nuance is needed, however, before simply blaming the Kremlin Trolls for undermining Western democratic processes. In 2019, United States Special Counsel Robert Mueller declared that there was insufficient evidence for a formal accusation against Russian authorities and their Kremlin Trolls.[54]

Dezinformatsiya campaigns in Lithuania

Like the two other Baltic states, Lithuania was one of the few former Soviet states to join the EU and NATO in 2004. After the annexation of Crimea in 2014, the Lithuanian government strongly disapproved of this Russian action. It became one of the chief advocates of an EU treaty with Ukraine, it is highly supportive of the EU sanctions against the Russian Federation and it is eager to assist Ukraine. Lithuania is also part of the avant-garde of NATO member states in raising awareness about Russian threats. Over the last six years, Lithuania has developed an increasingly frosty relationship with the Russian Federation.[55] As a counter-reaction, Russian authorities have targeted Lithuanian society with several dezinformatsiya campaigns, sometimes in the form of spetspropaganda. ‘The Russian authorities try to create a manipulated history that denies Lithuania’s right to exist’, a top Lithuanian official explained in the British newspaper The Guardian.[56] Examples are the spreading of rumours that Lithuania’s capital Vilnius should not belong to Lithuania because it was Polish territory in the interwar period, and that Klaipėda, Lithuania’s third largest city, never belonged to Lithuania but is supposedly Russian property because it was a gift from Stalin.[57]

Over the last two years, Facebook has become one of the most important battlefields for dezinformatsiya operations, and the Kremlin Trolls have increased their efforts to polarise Lithuanian public opinion. The methods they use are known from previous online activities. Rather than pushing certain narratives, the Kremlin Trolls disrupt public discourse by adopting extremist positions on both sides of Lithuania’s political spectrum, thereby attempting to split Lithuanian society, often by exploiting already sensitive and divisive topics. Kremlin Trolls have also tried to influence demonstrations in Lithuania by using social media, such as Facebook, prior to the demonstrations. Their working method tends to start in neutral groups on Facebook, such as fan groups of pop stars or famous actors, which attract large numbers of followers. The posts in these Facebook groups are initially related to the subject of the group, and then slowly but steadily dezinformatsiya is inserted between neutral posts, thereby exposing the entire community belonging to that Facebook group to malicious disinformation. Kremlin Trolls usually organise their activities through VKontakte, the Russian version of Facebook, and then engage on Facebook.[58] Although Facebook is their favourite platform, Kremlin Trolls are also active on other social media platforms, such as YouTube, Instagram, Tumblr, Snapchat, Pinterest and LinkedIn.[59]

During the spring of 2020 the number of coronavirus-related information incidents grew, and the Lithuanian population and NATO troops remained the main targets of the dezinformatsiya campaign. Since the start of the COVID-19 pandemic, Kremlin Trolls have become increasingly active in using the opportunity to spread dezinformatsiya. Between February and April 2020, Lithuanian authorities identified a total of 869 coronavirus-related information incidents of various types, not only in Lithuanian but also in Russian and English. Those who spread disinformation seek to capitalise on the COVID-19 pandemic to sow fear and tensions and turn public opinion against NATO troops in Lithuania.[60] The disinformation used is often a combination of agitprop and kompromat. In January 2020 a source, supposedly a Kremlin Troll, posted a made-up story on the Lithuanian news website Kauno.diena.lt, or Kaunas Day, claiming that an American soldier of the U.S. Army’s 1st Cavalry Division based in Lithuania had been diagnosed with COVID-19. The story was removed after having been online for just a couple of minutes. In March 2020, a manipulated narrative was posted on the Baltic web portal Delfi.lt, claiming that the massive allied exercise Defender Europe 2020, recently scaled back due to COVID-19 precautions, would still take place in Lithuania, but secretly.[61] In April 2020, a falsified statement by NATO Secretary General Jens Stoltenberg on the alleged withdrawal of NATO troops from Lithuania due to the corona crisis was emailed across Lithuania to the press, the government, the NATO Headquarters in Brussels and the Lithuanian Defence Ministry.[62] These notifications are just a few examples of a larger campaign launched in opportunistic abuse of the corona crisis.

Lithuanian response

Russia’s aggression against Ukraine in 2014 caused a paradigm change in Lithuania’s strategic culture. One of the most significant effects has been that defence took centre stage in political and societal life in a way not witnessed in Lithuania since its independence in 1990. The state of the Lithuanian armed forces (LAF) came under intense scrutiny. Since 2014, the Lithuanian defence budget has grown by 20-30 per cent annually, the fastest growth rate in the world. As part of the changes, Lithuania reinstated conscription, which immediately sparked a huge wave of interest among potential participants.[63] The current LAF consists of Land, Air and Naval Forces, a Special Operations Force, Military Police, a Logistics Command and a Training and Doctrine Command. The LAF includes about 20,000 soldiers in active service, while almost 6,000 reserve soldiers are part of the National Defence Volunteer Forces (NDVF).[64] Lithuania considers itself a ‘front state’ against the Russian Federation, with the LAF and NDVF being the armed nucleus of all its defence activities. The Lithuanian defence system is based on the concept of ‘total and unconditional defence’, as required by Lithuania’s 2012 National Security Strategy.[65]

Part of the change process was greater latitude for security subcultures promoting non-military instruments, such as strategic communication and sophisticated cyber protection. A few years ago, members of the Lithuanian Special Operations Forces decided to establish the LAF Strategic Communications Department, which has since transformed into a unit with a civil-military structure. It has become the top choice for Lithuanian public media regulators seeking expert advice on suspected violations by Russian media of Lithuanian laws prohibiting war propaganda or incitement to ethnic hatred. In addition, the department’s employees have become masters in detection, and Russian media output is closely monitored.[66] Today, the department also has far-reaching powers, such as closing down websites.[67]


Dutch military personnel of NATO’s eFP during an exercise in Lithuania. Photo MCD, Jasper Verolme

Besides these government initiatives, other steps have been taken to counter dezinformatsiya in Lithuania. The first private fact-checking initiatives in the country have emerged. One news portal runs a fact-checking project, launched in 2016 by journalist Liepa Zelniene, that, as its name implies, checks news items every 15 minutes. Another project was established in 2017 by the biggest news portal in Lithuania; it brings together the military, journalists and civil society to detect dezinformatsiya on its website and has also attracted external funding. In addition, quite different initiatives have been launched. Media literacy, for instance, has become a hot topic in Lithuania; together with critical thinking, it is one of the top priorities in the Lithuanian government’s programme for the eradication of dezinformatsiya. The national strategy Lithuania 2030 aims to introduce media literacy programmes in all education institutions, from nursery schools to universities.[68]

An important part of Lithuania’s counter-disinformation strategy is that it does not only include government initiatives but extends well into the wider Lithuanian society. An example is the so-called ‘elves’, volunteers who set out to combat Kremlin Trolls under the motto ‘elves can beat the trolls’. The size of the elves’ community changes constantly, but it numbers in the thousands and includes journalists, IT professionals, businesspeople, students, and scientists. They all participate for a good cause: to prevent the Russian authorities and Kremlin Trolls from carrying out malicious dezinformatsiya campaigns in Lithuania. The elves consider themselves a movement, not an organisation. Their aim is to expose and combat false claims and contested narratives as quickly as possible. There are different types of elves: some debunk manipulated information, while others run ‘blame and shame’ online campaigns against the Kremlin Trolls. In the Financial Times one of the elves stated: ‘In Lithuania we work in one direction, even with the media, which normally are competitors. When we need to defend our country against propaganda and dezinformatsiya, we are united!’[69]

Conclusion

This article focused on the question: how are Russian disinformation campaigns used against Lithuania? Russian disinformation, or dezinformatsiya, can be considered a carefully crafted message to deceive the decision-making elite or the public of a target nation, community or group of people, with every message of disinformation at least partially conforming to generally accepted beliefs. Dezinformatsiya is not a modern invention but has been practised since the Soviet era, and most dezinformatsiya operations are conducted by Russian politicians and diplomats, NGOs, the mainstream media and, nowadays, frequently by Kremlin Trolls on social media. The main goal of these dezinformatsiya operations is to disrupt Western democracies, especially the three Baltic states. Since the Baltic states are both NATO and EU members, the dezinformatsiya problem is also becoming a concern for these two organisations and their member states.

Over the past six years, since the annexation of Crimea, Lithuania has seen an increase in campaigns by the Russian authorities. They do not like Lithuania’s membership of the EU and NATO and regard the immediate proximity of these two organisations as a threat to their own stability. To counter this perceived threat, the Russian authorities frequently target Lithuania’s population with dezinformatsiya, including non-factual information, spetspropaganda, agitprop and kompromat, in order to create disarray and chaos in Lithuanian society. This is not the only reason for Russian dezinformatsiya operations in Lithuania, however. Russian authorities are also seeking to drive a wedge between the population and the foreign troops stationed in Lithuania under NATO’s enhanced Forward Presence (eFP), including Dutch military personnel. On top of that, the Russians do not shrink from opportunism and spread all sorts of slander about COVID-19 in Lithuania.

In Lithuania, a special Strategic Communications Department has been established within the Lithuanian armed forces to detect and, if necessary, eliminate dezinformatsiya. Other projects have also been launched with which the government, military personnel, and the Lithuanian media fight together against dezinformatsiya. Notable is the elves movement, which has led to a hard and grim information war under the guise of a ‘seemingly charming fairy tale’ starring elves and trolls.

Relevance for the Netherlands

Dutch government organisations and media often feel uncomfortable about far-reaching cooperation projects with the Netherlands armed forces in order to tackle unwelcome information. On the other hand, to all intents and purposes, the Netherlands government would do well to consider setting up an inter-departmental unit to prevent unwanted interference via all sorts of manipulated information. Let’s be honest: in the security domain, the Netherlands suffers from a very serious form of the gullibility syndrome: ‘Oh well, it won’t happen to us, we are perfectly safe behind the dikes and surrounded by friendly nations, such as Germany, Belgium, France and the United Kingdom.’ However, Russian dezinformatsiya campaigns in other countries are a wake-up call for the Netherlands, its society, its government and its institutions. It should be kept in mind that the Netherlands’ firm and critical attitude towards Russia’s alleged involvement in the MH17 disaster, the support of Dutch government institutions for FBI revelations about Russian hacker groups,[70] the immediate expulsion of Russian intelligence officers following the attempted hack of the OPCW in The Hague, the solidarity with the United Kingdom during the Skripal affair, and Dutch military participation in NATO’s eFP in Lithuania inevitably lead to a Russian response. It is therefore paramount for the entire Dutch society to be well aware of the likelihood of being targeted, now and in the future, by Russian dezinformatsiya campaigns.

[1] NATO, ‘Elves vs Trolls – Fighting Disinformation in Lithuania’, YouTube, 3 May 2017. See: https://www.youtube.com/watch?v=KDsrwSX7piw.

[2] Agnia Grigas, Beyond Crimea: The New Russian Empire (New Haven, CT (USA), Yale University Press, 2016) 136; Ofer Fridman, Russian Hybrid Warfare: Resurgence and Politicisation (London (UK), Hurst & Company, 2018) 171.

[3] Jörg Noll et al, ‘De Baltische Staten, de Russische Minderheden en de Verdediging van de NAVO’, in: Militaire Spectator 186 (2017) (4) 173.

[4] Viljar Veebel, ‘Why It Would Be Strategically Rational for Russia to Escalate in Kaliningrad and the Suwalki Corridor’, in: Comparative Strategy 38 (2019) (3) 182-197.

[5] North Atlantic Treaty Organisation, Factsheet ‘NATO’s Enhanced Forward Presence’, May 2017. See: https://www.nato.int/nato_static_fl2014/assets/pdf/pdf_2017_05/1705-fact....

[6] Anne Bakker, ‘Dutch Perspectives on the Security of the Baltic States’, Clingendael Spectator, 20 December 2017. See: https://spectator.clingendael.org/nl/publicatie/dutch-perspectives-secur....

[7] Netherlands Ministry of Defence, ‘Current Missions’, 8 June 2020. See: https://english.defensie.nl/topics/missions-abroad/current-missions.

[8] Christian Kamphuis, ‘Reflexive Control: The Relevance of a 50-year-old Russian Theory Regarding Perception Control’, in: Militaire Spectator 187 (2018) (6) 337.

[9] Bo van den Herik, Tine Molendijk and Han Bouwmeester, ‘Zeg me dat het niet waar is…? Een zoektocht naar Nederlands beleid en de rol van de krijgsmacht tegen desinformatie’, in: Militaire Spectator 189 (2020) (9) 418-429.

[10] Ladislav Bittman, The KGB and Soviet Disinformation: An Insider’s View (McLean, VA (USA), Pergamon-Brassey’s International Defense Publishers, 1984) 49.

[11] Ladislav Bittman, The Deception Game: Czechoslovak Intelligence in Soviet Political Warfare, Syracuse University Research Corporation (New York, NY (USA), Ballantine Books/Random House, 1972) 20.

[12] Peter Pomerantsev and Michael Weiss, The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money, A special report presented by The Interpreter, Institute of Modern Russia, November 2014.

[13] Jon White, Dismiss, Distort, Distract, and Dismay: Continuity and Change in Russian Disinformation, Policy Brief, Issue 2016/13, (Brussels (BEL), Vrije Universiteit Brussel, Jean Monnet Centre for Excellence, Institute for European Studies, 2016).

[14] Frans van Nijnatten, ‘Het antwoord van Gerasimov op het Paard van Troje’, in: Militaire Spectator 188 (2019) (7/8) 394.

[15] Ion Mihai Pacepa, Disinformation: Former Spy Reveals Secret Strategies for Undermining Freedom, Attacking Religion and Promoting Terrorism (Washington, DC (USA), WND Books, 2013) 39.

[16] Walter Krivitsky, In Stalin’s Secret Service, Reprint (Frederick, MD (USA), University Publications of America, 1967) 234.

[17] KGB stands for Komitet Gosudarstvennoy Bezopasnosti, or Committee for State Security, the Soviet Secret Service.

[18] Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare (London (UK), Profile Books, Ltd, 2020) 145-146.

[19] Michael Voslensky, ‘The Empire of Lies’, in: Raymond Sleeper, Mesmerized by the Bear: The Soviet Strategy of Deception (New York, NY (USA), Dodd, Meade & Company, 1987) 33.

[20] Steve Rosenberg, ‘Ukraine Crisis: West Wants to “Seize Control” – Russia’, BBC News, 25 April 2014. See: https://www.bbc.com/news/world-europe-27153909.

[21] Steve Abrams, ‘Beyond Propaganda: Soviet Active Measures in Putin’s Russia’, in: Connections: The Quarterly Journal 15 (2016) (1) 7.

[22] Richard Shultz and Roy Godson, Dezinformatsia: Active Measures in Soviet Strategy (McLean, VA (USA), Pergamon-Brassey’s International Defense Publishers, 1984) 39.

[23] Nicolas Cull et al., Soviet Subversion, Disinformation and Propaganda: How the West Fought Against It, An Analytic Report with Lessons for the Present (London (UK), London School of Economics and Political Science, LSE-consulting, 2017) 18.

[24] Aristedes Mahairas and Mikhail Dvilyanski, ‘Disinformation - Дезинформация (Dezinformatsiya)’, in: The Cyber Defense Review 3 (2018) (3) 21.

[25] Oleg Kalugin, quoted in: Mahairas and Dvilyanski, ‘Disinformation’, 21.

[26] Bittman, The Deception Game, 4-5; Matthew Lauder, Truth is the First Casualty of War: A Brief Examination of Russian Informational Conflict during the 2014 Crisis in Ukraine, Scientific Letter, DRDC-RDDC-2014-L262, (Ottawa (CAN), Defence Research and Development Canada, 2014) 3.

[27] Vasili Mitrokhin and Christopher Andrew, The Mitrokhin Archives: The KGB in Europe (London (UK), Penguin Books, 2000) E-Book.

[28] Or Honig and Ido Yahel, ‘The Art of “Subversive Conquest”: How States Take over Sovereign Territories Without Using Military Force’, in: Comparative Strategy 36 (2017) (4) 294.

[29] Leonard Doob, Propaganda: Its Psychology and Technique (New York, NY (USA), Henry Holt and Company, 1935) 35-37.

[30] Leonard Doob, Propaganda, 51-56.

[31] Karthik Narayanaswami, Analysis of Nazi Propaganda: A Behavioral Study (Cambridge, MA (USA), Harvard University, Faculty of Arts&Sciences, 2017) 4.

[32] Hugh Rank, ‘Teaching about Public Persuasion: Rationale and Schema’, in: Daniel Dietrich (ed), Teaching about Doublespeak (Urbana, IL (USA), National Council of Teachers of English, 1976) 3-20.

[33] Han Bouwmeester, ‘Lo and Behold Let the Truth Be Told: Russian Deception Warfare in Crimea and Ukraine and the Return of “Maskirovka” and “Reflexive Control Theory”’, in: Paul Ducheine and Frans Osinga (eds), Netherlands Annual Review of Military Studies 2017 (NLARMS 2017), Winning Without Killing: The Strategic and Operational Utility of Non-Kinetic Capabilities in Crises (The Hague (NLD), Springer/T.M.C. Asser Press, 2017) 138.

[34] Peter Kenez, The Birth of the Propaganda State: Soviet Methods of Mass Mobilization, 1917-1929 (Cambridge (UK), Cambridge University Press, 1985) 251-255.

[35] Viktoria Margaryan, ‘Russian Information Warfare’, (2014). See: https://www.academia.edu/9596147/Russian_information_warfare.

[36] Jolanta Darczewska, The Anatomy of Russian Information Warfare: The Crimea, Point of View Nr 24, (Warsaw (POL), Centre for Eastern Studies, 2014) 25.

[37] ‘RT’ was formerly known as ‘Russia Today’.

[38] Edward Lucas and Peter Pomerantsev, Winning the Information War: Techniques and Counter-strategies to Russian Propaganda in Central and Eastern Europe, A Report by CEPA’s Information Warfare Project in Partnership with the Legatum Institute, (August 2016), 6.

[39] Alena Ledeneva, How Russia Really Works: The Informal Practices that Shaped Post-Soviet Politics and Business (Ithaca, NY (USA), Cornell University Press, 2006) 58-56.

[40] Maria Snegovaya, Putin’s Information Warfare in Ukraine: Soviet Origins of Russia’s Hybrid Warfare, (Washington, DC (USA), Institute for the Study of War, 2015) 12-13.

[41] Alexandr Bedritsky, Realization of the Concepts of Information Warfare by Military and Political Leadership of the USA during the Modern Era (Moscow (RF), Russian Institute for Strategic Studies (RISI), 2007).

[42] Max Bergmann and Carolyn Kenney, War by Other Means: Russian Active Measures and the Weaponization of Information (Washington, DC, Center for American Progress, June 2017).

[43] Erik Donkersloot, ‘Hybrid Threats from the East: The Gerasimov Doctrine and Intelligence Challenges for NATO’, in: Militaire Spectator 186 (2017) (9) 395.

[44] Alexander Averin, ‘Russia and its Many Truths’, in: Jente Althuis and Leonie Haiden (eds), Fake News: A Roadmap (Riga (LTV), NATO Strategic Communications Centre of Excellence/London (UK), The King’s Centre for Strategic Communications, 2018) 59-60.

[45] Todd Helmus, Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe, Addendum (Santa Monica, CA (USA), RAND Corporation, 2018) 3.

[46] Renée DiResta et al, The Tactics & Tropes of the Internet Research Agency, A Report Supported by the United States Senate Select Committee on Intelligence (Austin, TX (USA), New Knowledge, 2018) 6.

[47] Adrian Chen, ‘The Agency’, New York Times Magazine, 2 June 2015. See: https://www.nytimes.com/2015/06/07/magazine/the-agency.html.

[48] ‘Astroturfing’ is the practice of masking the originator of a message to make it appear as though it derives from and is supported by a grassroots participant.

[49] Joel Harding, ‘Glavset is the New Name for Russian Internet Research Agency: The Russian Troll Farm’, To Inform is to Influence, 10 September 2017. See: https://toinformistoinfluence.wordpress.com/2017/09/10/glavset-is-new-na....

[50] Andrew Dawson and Martin Innes, ‘How Russia’s Internet Research Agency Built Its Disinformation Campaign’, in: The Political Quarterly 90 (2019) (2) 246; John Gallacher and Rolf Fredheim, ‘Division Abroad, Cohesion at Home: How the Russian Troll Factory Works to Divide Societies Overseas But Spread Pro-regime Messages at Home’, in: Sebastian Bay (ed), Responding to Cognitive Security Challenges (Riga (LTV), NATO Strategic Communications Centre of Excellence, 2019) 61-80.

[51] DiResta, The Tactics & Tropes, 6.

[52] Georgina Lee, ‘Here Is What We Know About Alleged Russian Involvement in Brexit’, 4 News, Channel 4, 16 November 2017. See: https://www.channel4.com/news/factcheck/heres-what-we-know-about-alleged... Nick Cohen, ‘Why Isn’t There Greater Outrage about Russia’s Involvement in Brexit?’, The Guardian, 17 June 2018. See: https://www.theguardian.com/commentisfree/2018/jun/17/why-isnt-there-gre... ; Brittany Kaiser, Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump and Facebook Broke Democracy and How It Can Happen Again (New York, NY (USA), HarperCollins Publishers, 2019) 333-353.

[53] Andy Greenberg, ‘Don’t Pin the Macron Email Hack on Russia Just Yet’, Wired, 5 August 2017. See: https://www.wired.com/2017/05/dont-pin-macron-email-hack-russia-just-yet/.

[54] United States Department of Justice (US DOJ), ‘Special Counsel Robert S. Mueller III Makes Statement on Investigation into Russian Interference in the 2016 Presidential Election’, 29 May 2019. See: https://www.justice.gov/opa/speech/special-counsel-robert-s-mueller-iii-....

[55] Kremlin Watch, ‘Lithuania’, June 2020. See: https://www.kremlinwatch.eu/countries-compared-states/lithuania/.

[56] Emma Graham-Harrison and Daniel Boffey, ‘Lithuania Fears Russian Propaganda is Prelude to Eventual Invasion’, The Guardian, 3 April 2017. See: https://www.theguardian.com/world/2017/apr/03/lithuania-fears-russian-pr....

[57] Graham-Harrison and Boffey, ‘Lithuania Fears Russian Propaganda’.

[58] Jacob Willemo, Trends and Developments in the Malicious Use of Social Media (Riga (LTV), NATO Strategic Communications Centre of Excellence, 2019) 25.

[59] Keir Giles, James Sherr and Anthony Seaboyer, Russian Reflexive Control (Kingston, Ontario (CAN), Royal Military College of Canada, Defence Research and Development Canada, 2018) 30; Christian Bell, Use of Social Media as an Effort, Multinational Capability Development Campaign, (Mayen (GER), Zentrum für Operative Kommunikation der Bundeswehr, 2016).

[60] BNS/TBT Staff, ‘Lithuanian Military Warns of Increase in Coronavirus-related Disinformation’, The Baltic Times, 27 April 2020. See: https://www.baltictimes.com/lithuanian_military_warns_of_increase_in_cor....

[61] Patrick Tucker, ‘Russia Pushing Coronavirus Lies as Part of Anti-NATO Influence Ops in Europe’, Defense One, 26 March 2020. See: https://www.defenseone.com/technology/2020/03/russia-pushing-coronavirus....

[62] Baltic News Service Staff, ‘Fake News on NATO Withdrawal from Lithuania Sent to Media, Brussels’, LRT, 22 April 2020. See: https://www.lrt.lt/en/news-in-english/19/1166199/fake-news-on-nato-withd....

[63] Kristine Atmante, Riina Kaljurand and Tomas Jermalavičius, ‘Strategic Cultures of the Baltic States: The Impact of Russia’s New Wars’, in: Katalin Miklóssy and Hanna Smith (eds), Strategic Culture in Russia’s Neighborhood: Change and Continuity in an In-Between Space (Lanham, MD (USA), Lexington Books, 2019) 67-69.

[64] International Institute for Strategic Studies (IISS), The Military Balance 2019 (London (UK), IISS, 2019) 125.

[65] Masha Hedberg and Andres Kasekamp, ‘Baltic States’, in: Hugo Meijer and Marco Wyss (eds), The Handbook of European Defence Policies & Armed Forces (Oxford (UK), Oxford University Press, 2018) 226.

[66] Atmante, Kaljurand and Jermalavičius, ‘Strategic Cultures of the Baltic States’, 67-69.

[67] VPRO Tegenlicht, ‘Aan het Front van de Informatieoorlog’, Directed by Mea Dols de Jong, 17 May 2020. See: https://www.vpro.nl/programmas/tegenlicht/kijk/afleveringen/2019-2020/aa....

[68] Viktor Denisenko, ‘Lithuania: Disinformation Resilience Index’, Ukrainian Prism Foreign Policy Council, 31 July 2018. See: http://prismua.org/en/9065-2/.

[69] Michael Peel, ‘Fake News: How Lithuania’s “Elves” Take on Russian Trolls’, Financial Times, 4 February 2019. See: https://www.ft.com/content/b3701b12-2544-11e9-b329-c7e6ceb5ffdf.

[70] Huib Modderkolk, ‘Dutch Agencies Provide Crucial Intel about Russia’s Interference in US-elections’, de Volkskrant, 25 January 2018. See: https://www.volkskrant.nl/wetenschap/dutch-agencies-provide-crucial-inte... Max Smeets, ‘The Netherlands just Revealed its Cybercapacity. So What Does That Mean?’, Washington Post, 8 February 2018. See: https://www.washingtonpost.com/news/monkey-cage/wp/2018/02/08/the-nether....

About the author(s)

Prof. dr. A.J.H. Bouwmeester MMAS

Han Bouwmeester is Professor of Military Operational Sciences at the Faculty of Military Sciences, Netherlands Defence Academy.