How Americans Get Tricked Into Participating in Disinformation Campaigns

August 31, 2020 Topic: Security Region: Eurasia Tags: Russia, Disinformation, Election, Bots, Trolls

There are things that people can do to vaccinate themselves against the disinformation efforts of foreign adversaries.

Russia capitalizes on fear and doubt by weaving disinformation campaigns into vulnerable media sources. This is hardly a new phenomenon: manipulating information is something in which the Kremlin is well-versed. Concern over the West's ability to defend itself against cyber and psychological warfare only grows, particularly now that people's dependence on communication technology is at an all-time high due to the pandemic.

It seems that the only defense the average Internet user has against disinformation is to check their news sources. But to legitimately combat Russian disinformation, people must first learn its meaning and history, which opens the door to additional, grassroots methods of countering it.

Fake news and disinformation are doing more than sowing discord at the dinner table, or even across the political aisle. Now, news credibility is a matter of life and death. It’s no secret that Russia’s disinformation campaigns intend to destabilize Western cultural and governmental institutions, particularly by creating an underlying sense of distrust and disillusionment with the existing systems.

The race to influence information is not new, but Russia's long-term campaigns, and their longer-term impacts, pose a unique threat to the West. Take the KGB's infamous Operation Infektion, which capitalized on the stigma surrounding the HIV/AIDS crisis. The ongoing impact of a phony article planted in an Indian newspaper in the 1980s is staggering. As recently as 2018, a Los Angeles-based study indicated that significant distrust persisted in the African-American gay community about the origins of HIV/AIDS and about the U.S. government's commitment to finding a cure.

Disinformation experts at organizations such as DFRLab have delved deep into Operation Secondary Infektion, a nod to its predecessor. Unsurprisingly, Kremlin-supported bots, hackers, and trolls are using the Internet to their advantage. Women suffer acutely at the hands of Secondary Infektion through gender-based disinformation; those particularly vulnerable include women running for political office and women journalists investigating authoritarian regimes.

An outlandish example is a viral video of a woman pouring bleach on manspreaders in the St. Petersburg metro. The video was proven to be staged and released by a Kremlin-backed subsidiary, all in an attempt to undermine feminism. More women are breaking the political glass ceiling, and awareness of continued gender disparities is growing worldwide. Considering Operation Infektion's long-term influence, it is frightening to think of the far-reaching impact this type of disinformation will have on future generations of women.

They say the truth can set you free. So now that people are aware of Russia's motive for disinformation, why does their vulnerability to these campaigns remain so palpable? The answer is simpler than many would expect and lies in the word's very definition. The term "disinformation" has migrated from intelligence-community jargon into everyday conversation, and many, including reputable news sources, use it interchangeably with "misinformation."

The key difference between the two terms lies in intention: misinformation is false information spread without intent to deceive, while disinformation is deliberately crafted to mislead. In an increasingly polarized atmosphere, it is easy to assume ill intent when presented with fragmented or partisan information from the opposite end of the political spectrum. Democrats and Republicans alike fling the term "disinformation" at each other, especially as the presidential election draws closer. Overusing the word inevitably trivializes its meaning, which in turn leaves Americans vulnerable to the legitimate danger that Russia's disinformation campaigns pose to society. Interestingly enough, members of both parties share largely the same beliefs about the dangers of false information, with a few exceptions. Optimistically, this suggests that Americans have a unifying rallying point, which means they need to do more than just check their news sources.

There are things they can do to vaccinate themselves against disinformation campaigns. Since the 1960s, psychologists have studied inoculation theory and its political impact by applying it to issues such as conspiracy theories and climate change denialism.  

The first key technique of inoculation is "threat": the perception that one's beliefs are under attack, which motivates defending them. Studies indicate that college students and young adults are quick to defend their beliefs, particularly if they perceive that their thoughts or behavior are being controlled. Since disinformation's fundamental goal is to influence perception and behavior, an obvious solution is to stress this point to younger generations, especially on social media.

Another key technique of inoculation is refutational preemption, which boils down to anticipating common counterarguments in order to strengthen one's own viewpoint. This is easily practiced by conducting meaningful research and engaging in challenging conversations. Instead of solely reading news that reinforces personal and political beliefs, use it intermittently as a research tool to understand the recurring themes and stories woven together by Russian disinformation. By thinking of disinformation as unified narratives that challenge Western values, one can spot suspicious content far more efficiently. This is easily done with a periodic, five-minute scroll through any number of investigative sites, such as EUvsDisinfo, DFRLab, Graphika, or Bellingcat.

Along with inoculating ourselves against fake news, we should keep a critical eye on social media posts and memes. If something smacks of conspiracy, even a simple, comical meme, do some digging. Memes are an easy way to trigger an inflammatory reaction and, surprisingly, can have major political impact. Using memes as a delivery method for disinformation is known as memetic warfare. Memes trigger visceral, emotional reactions but rarely reveal their creator; therein lies the concern when memes turn political. One group even boasted of "meme-ing Trump into the White House."

Additionally, studies on confirmation bias indicate that people tend to remember things that validate their own viewpoints, rather than things that challenge them. This drives home the need to check source credibility—even if the overall sentiment of a shared post is agreeable. 

Currently, society is living through a collective trauma. Freedoms that people didn't even know they had were stripped from them overnight. Annoyances of daily life, such as a crowded bus or someone coughing without covering their mouth, are now a public health nightmare. Though many will find it silly to grieve over such a universal experience, staying aware of one's emotional baseline is a critical practice when navigating an emotionally charged political and social landscape, disinformation included.

A little critical thinking goes a long way. While the solutions proposed here may seem elementary for such a long-term, multilayered problem, simple grassroots approaches are every bit as important as institutional responses to this continued threat. If the two are combined, disinformation may yet fade from our social media feeds and, one hopes, survive only in our history books.

Elinor Harty is a Program Coordinator at the Woodrow Wilson International Center for Scholars. A Russian Studies graduate of George Mason University, she received a 2018 Critical Language Scholarship to study in Tbilisi, Georgia.

Image: Reuters