Russia’s Gray Zone Threat after Ukraine
Moscow may be economically weak, and its conventional military is a far cry from the feared Red Army of the Cold War. But Russia is far from down and out.
Despite Prigozhin’s death, the Kremlin could also use private military companies in the Middle East, Africa, and other regions to increase Russian influence, undercut U.S. leadership, present itself as a security partner, and gain military access and economic opportunities. Russia could also work with partners like Iran to covertly target U.S. or other NATO forces overseas. During the war in Syria, Russia and Iran worked closely with Lebanese Hezbollah and Iraqi militias to retake territory for the Bashar al-Assad government. These covert tools give Moscow numerous options to hit back at the West.
Cyber Operations
Russian security agencies, such as the GRU, SVR, and FSB, have increasingly conducted cyber attacks to target critical infrastructure, undermine democratic institutions, steal government and corporate secrets, and sow disorder within or between Western allies. In some cases, Russia has conducted cyber attacks in tandem with military or paramilitary operations.
One frequent tactic is to sabotage adversaries’ critical infrastructure or to plant malware in it for use in a future war. Russian malware is designed to carry out a range of malicious activities, such as overwriting or deleting data, rendering machines unbootable, and disrupting critical infrastructure, including industrial production and processes. Russia and Russian-linked hackers rely on a range of common intrusion techniques, such as exploiting public-facing web applications, sending spear-phishing e-mails with malicious attachments or links, and stealing credentials to use valid e-mail accounts.
In 2017, for example, the GRU deployed NotPetya, data-destroying malware that spread across multiple networks before executing a disk-encryption program that destroyed all data on targeted computers. NotPetya’s global impact was massive, disabling an estimated 500,000 computers in Ukraine, decreasing Ukraine’s GDP by 0.5 percent in 2017, and affecting organizations across sixty-five countries. Global victims included the U.S. multinational companies FedEx and Merck, which lost hundreds of millions of dollars to cleanup costs and disrupted business.
In 2022, Russia conducted multiple cyber operations against Ukraine’s critical infrastructure. A day before the invasion, Russian attackers launched destructive wiper attacks on hundreds of systems in Ukraine’s energy, information technology, media, and financial sectors. Russia’s goal was likely to undermine Ukraine’s political will, weaken Ukraine’s ability to fight, and collect intelligence that Russia could use to gain tactical, operational, and strategic advantages. Over the next several weeks, Russian actors linked to the GRU, FSB, and SVR conducted numerous cyber attacks using malware families such as WhisperGate and FoxBlade.
The West is also a target. In 2020, the SVR orchestrated a brazen attack against dozens of U.S. companies and government agencies by inserting malware into a software update from SolarWinds, a company based in Austin, Texas, that makes network monitoring software. In 2021, DarkSide, a hacking group operating in part from Russian soil, conducted a ransomware attack against the U.S. company Colonial Pipeline, which led executives to shut down the company’s pipeline for several days and created fuel shortages across the southeastern United States.
In 2023, Polish intelligence services claimed that Russia had hacked the country’s railway network in an attempt to disrupt rail traffic, some of which carries weapons bound for Ukraine. According to U.S. government assessments, Russia has targeted the computer systems of underwater cables and industrial control systems in the United States and allied countries. Compromising such systems both enables and demonstrates Russia’s ability to damage critical infrastructure during a crisis.
Russian agencies also use cyber attacks during elections to undermine faith in democracy, influencing public sentiment during the campaign and raising questions about the democratic process. Moscow has targeted specific candidates by stealing or forging documents and then leaking them on public websites or social media platforms; the objective of these so-called hack-and-leak operations is to undermine faith in political candidates. Another tactic is to disrupt the voting or counting process by targeting computer systems. Russia has also used election-related cyber attacks to influence issues of importance to Moscow. For example, Russian security agencies have targeted multiple European elections to weaken support for the European Union, NATO, and the United States.
The breadth of Russian activity is impressive. Russian cyber campaigns have attempted to disrupt elections in the United States, France, Germany, the Czech Republic, the Netherlands, Spain, Italy, Bulgaria, Austria, and dozens of other countries, according to the Dyadic Cyber Incident Database compiled by U.S. academics. These attacks are likely to continue, including during the 2024 U.S. presidential election campaign.
Disinformation
Russia has long used disinformation, often more effectively than its rivals, to supplement other tools and as a weapon in its own right. During the Cold War, the Soviet Union successfully promoted the falsehoods that the CIA was linked to the assassination of President John F. Kennedy and that U.S. scientists had invented the AIDS virus, the latter a campaign referred to as Operation Denver.
Russia uses information campaigns abroad to make the Putin regime look good at home. By highlighting pro-Russian sentiment in Europe, the corruption of Russia’s enemies, and unpopular European policies on immigration, Moscow tries to make its own regime more popular as well as to discredit its enemies. Similarly, Russia has tried to cast itself as a muscular Christian nation, contrasting its policies with those of Europe and the United States, which it portrays as overly friendly to LGBTQ+ people and immigrants.
Beyond bolstering Putin, disinformation is a way to weaken and divide Russia’s enemies. Famously, the Internet Research Agency, a Russian troll farm, used disinformation in an attempt to influence the 2016 U.S. election, seeking to discredit former secretary of state Hillary Clinton and promote Donald Trump. On YouTube, Instagram, Facebook, Twitter, and other social media platforms, trolls pushed propaganda on immigration, race, and gun rights to conservative accounts, while other parts of the Russian effort encouraged Black Americans to protest, inflaming tension among Americans.
In subsequent years, Russia has spread conspiracy theories about COVID-19 and other topics, used a fake news site in 2020 to get legitimate U.S. journalists to write stories on social disruption in the United States, and exaggerated the potential side effects of COVID-19 vaccines to decrease support for the Biden administration.
Ukraine is both a subject and a target of disinformation. In the years between Russia’s 2014 proxy war and its 2022 invasion, Russian propaganda stressed that Ukraine was a failed, Nazi-led state whose army brutalized the local population. After Russia invaded Ukraine, the disinformation machine kicked into overdrive, both to justify the invasion at home and to undermine support for helping Ukraine abroad. To audiences in European countries hosting large numbers of refugees, such as Poland, Russian propaganda claimed that governments were favoring refugees over their own citizens. In Africa and other parts of the developing world, Moscow pushed the idea that the EU had banned Russian agricultural products while keeping Ukraine’s grain for itself, causing a global food crisis.
Russia exploits overt and covert information sources, ranging from official government media to covert disinformation spread by government agencies, often in combination. Russia’s Foreign Ministry, for example, played up false reports from Russian media that immigrants had raped a thirteen-year-old Russian-German girl, stirring up divisions in Germany and accusing the German government of not doing enough to protect its people, a message that undermined German confidence in the government and bolstered Russia’s image as tough on criminal immigrants. Even the Russian Orthodox Church, whose patriarch is staunchly pro-Putin, is involved. The church spreads propaganda while allowing its facilities to serve as safe houses where Russian priests work with Russian intelligence agents.
Social media offers numerous, and cheap, additional ways to spread disinformation. Moscow uses fake accounts, anonymous websites, bots, and other means to spread its message, often using these channels to push RT and Sputnik propaganda and to provide “evidence” for further lies from official media. Some of this involves troll accounts operated by humans. Moscow also uses bots to amplify content and exploits social media companies’ algorithms to target particular audiences. At times, Russia will create innocuous accounts focused on health, fitness, or sports and then, once they have built a substantial following, begin introducing political messages.
Each actor in this wide array has its own audience. In addition, they amplify one another, with state voices and seemingly independent ones validating each other. Some are harder to counter than others: it is one thing to block Russian state television or take down fake accounts, but quite another to block the Orthodox Church, which has millions of adherents outside Russia.
Generative AI offers a new means of spreading disinformation. At the outset of the Ukraine war, Russia circulated a deepfake of President Volodymyr Zelenskyy that made it appear he had fled the country and was urging his troops to lay down their arms. Less dramatically, Russia spread deepfakes on Facebook and Reddit that showed Ukrainian teachers praising Putin. The technology has improved by leaps and bounds since then. Deepfakes will become increasingly cheap and easy to produce at scale, allowing Russia to flood the zone with convincing falsehoods.
There are myriad potential uses of deepfakes. Russia’s attempt to blame Ukraine for instigating the 2022 invasion could be made more convincing in the future by “leaking” deepfakes of Ukrainian generals planning an attack on Russian territory. Moscow could spread scurrilous rumors about anti-Russian leaders and undermine their political support by releasing fake videos of them in compromising situations or making offensive remarks. Moscow could also try to further polarize the United States or other countries, worsening existing racial tension by releasing videos of supposedly violent Black Lives Matter rallies or of police abusing members of minority communities. In Europe, variations on this theme might play out with anti-migrant videos showing migrants committing rape and murder, often mixing genuine crimes and violence with false information.