How Russia Found a Disinformation Haven in America

There is no good solution to foreign countries’ election-meddling efforts on social-media platforms. Policymakers should expect that tech-sector regulatory proposals will be far from adequate.


Americans continue to discuss Russia’s information operations efforts in the wrong way. We have wasted time debating whether “Russia” or “Russians”—the government or government-connected individuals—meddled in the 2016 U.S. presidential election. The Mueller Report definitively established that the Russians, both through the Main Intelligence Directorate (GRU) and the Internet Research Agency (IRA), undertook information operations campaigns. This has been reasonably clear for a long time, even setting aside, for the benefit of the paranoid, the evidence put forward by government sources. Some on America’s Left and in the Center have been unable to drop the idea that President Donald Trump could not have won without foreign help. Likewise, the American Right has been unable to drop the idea that there is a “deep state” Leftist media conspiracy bent on undermining a democratically elected president. And for now, we will leave aside critiquing the collective shock that foreign “meddling” could influence elections in the United States, a nation that has worked to promote its own interests, in ways small and large, in many elections around the world—including every election conducted in post-Soviet Russia.

Framing the disinformation issue through this debate misses the point. The goal of the information operations campaigns was not simply to elect Donald Trump president. Nor was it only to polarize American politics further. The point was, rather, to keep undermining Americans’ ability to agree on what is true and what is not.


Russia’s strategy hinged on the fact that it is nearly impossible for people stuck in alternate realities, with competing and incompatible truth claims, to engage in civil discourse. Former Secretary of State Hillary Clinton and her colleagues did not, in fact, run a pedophilic, Satanic human-trafficking operation out of the basement of a DC pizza parlor. Trump did not, in fact, conspire with the Kremlin to deliver an unexpected electoral result because he had been compromised by video of salacious activities in Moscow hotel rooms. Things were much simpler than that. And yet one can find people on either end of the political spectrum who remain convinced that these false stories are true. It is difficult for these Americans to engage in rational conversation with one another.

How Americans Earned the Right to Their Own Facts

Facebook representatives testified that the IRA’s disinformation efforts on their platform cost around a hundred thousand dollars. We do not know what the IRA’s campaigns on Twitter, YouTube, Google or Instagram cost, as the social-media platforms face no legal obligation to disclose this information. Whatever it was, the total figure likely amounted to a rounding error in Russia’s grand-strategy budget, a number so small as to hardly warrant high-level authorization. It is hard to believe that President Vladimir Putin gets bogged down in the weeds of operations costing tens of thousands of dollars—but how such an inexpensive strategy achieved such dramatic success is a fascinating question. By “success,” we mean heightening the epistemological fragmentation of American society. Trump’s election to the presidency was not the point; almost no one believed that outcome to be plausible. It does not seem as though Trump himself believed it likely, much less “the Russians,” no matter how clever they might be. The Russian campaign would still have succeeded had Clinton won.

The IRA’s efforts in 2016 worked because multiple factors have primed Americans to disagree on the facts. Undoubtedly, power differentials in our society allow some Americans to minimize or ignore the experiences of others. And our loss of shared truth is certainly shaped by structural factors. An increasingly cutthroat news industry, for reasons baked into the media business model, has exacerbated this trend. Journalists and academics have chronicled the collapse of print news since the aughts. Last summer saw a revolution in the television sector, as TV behemoths consumed one another to stay competitive in an environment overturned by the Internet. And despite having upended the playing field for their predecessors, online media companies also face a shaky future. Many digital news platforms that previously set the tone for twenty-first-century journalism, including Vice, Vox, BuzzFeed and the Gizmodo Media Group, conducted layoffs in early 2019 amid persistent questions about the viability of their models. All of this raises questions about where we will find reliable information in the future.

It is hard, however, to overstate the Internet’s role in our current situation. In the United States, 62 percent of adults get their news from social media, and 44 percent get news from Facebook. Programming choices from the unregulated social-media sector now drive how information, fake or otherwise, gets funneled to American consumers. While the social-media revolution initially invited fantasies that technology would usher the world into a single epistemological space, it is hard to maintain faith that the Internet will grant everyone access to universal facts. Instead, the Internet has produced a profound fragmentation of our epistemological space. Now, like people in many other societies, we Americans live in our own exquisitely tailored informational worlds. This is a vulnerability that other countries can exploit as they wish—and one that Americans have exploited as well.

The Rough Path Forward

We have settled on improbable solutions for what to do from here. Some say that democratic countries should tackle disinformation collectively. Surveying recent proposals of this kind reveals vague solutions built on emotive buzzwords like “democracy” and “joint action.” Some call for miracles of leadership—perhaps for the NATO countries to unite around an action plan that would actually deliver results. Unfortunately, a decade after WikiLeaks entered the public lexicon, we probably lack the soft-power bandwidth to negotiate a counter-disinformation framework within the community of democratic nations. For centrist leaders like Angela Merkel, Emmanuel Macron or Jean-Claude Juncker, a partnership with Trump on an initiative inherently tied to sticky issues of culture and power would be political suicide. In the face of rising populism, policymakers should remember that corralling the money and manpower to tackle the disinformation issue jointly will require supreme force of will, the ability to explain to suspicious publics why this matters, and persistent humility in such communications.

The need for tech-sector regulation is recognized by almost everyone following the disinformation story in good faith. Not even the companies suggest anymore that they can handle the issue on their own. Yet even as tech executives claim to want to work with the public sector, Facebook, Twitter, and Google have been extremely reluctant to offer up data about the extent of the state-led disinformation problem on their platforms. And tech-sector leaders have vacillated dramatically on the topic in the space of just a few years.

We can trace this through the rather concerning statements that these leaders have made about regulation and disinformation since 2016. Twitter’s Jack Dorsey now “generally” thinks that “regulation is a good thing.” But last year, Dorsey asserted that it was the responsibility of journalists to “document, validate and refute” disinformation spread through his company, “so people can form their own opinions”—rather than the responsibility of Twitter, or of the government, to address the quirks of the platform that state-sponsored information operations teams exploit to drown out real discourse with strategic messaging. Likewise, there is a curious, yawning gap between Mark Zuckerberg’s November 2016 assertion that it was “crazy” to claim state actors had used Facebook to influence Trump’s win, his public questioning of the need for social-media regulation in 2018, and his March 2019 statement that governments should take a “more active role” in regulating his industry.


Policymakers should expect that tech-sector regulatory proposals, which call for reform on the companies’ own terms, are going to be inadequate. It is therefore dispiriting that congressional politicians, many of whom would be unable to manage their personal social-media strategies without the help of unpaid interns, have yet to put forward a cohesive vision for how to deal with the problem. The Trump administration seems even less likely to step up on this issue.

A Fundamental Recalibration

Yet even strong regulation paired with pan-democratic cooperation will not be enough. What the United States—and indeed every nation—needs is a fundamental recalibration of public discourse. Almost seventy years ago, in The Origins of Totalitarianism, Hannah Arendt wrote that publics had “reached the point where they would, at the same time, believe everything and nothing, think that everything was possible and nothing was true.” The early twentieth century, another era of rising populism, extreme political polarization, and epistemological fragmentation, taught the West harsh lessons: that refusing to confront reality together is dangerous, and that things can fall apart remarkably quickly when nothing feels true. We can learn these lessons again—perhaps this time without the chaos and calamity.

At a minimum, we must develop societal literacy about how disinformation campaigns function. This learning will differ across generations. Those of us who are older—not necessarily very old, but perhaps older than forty—should be invited to understand the security threat posed by our consumption of half-truths and blatant untruths on social media. Though younger generations may be warier about what they see online, they could benefit from more education on state-led strategic messaging. In an ideal world, this education would be academically and globally oriented, and separated as much as possible from partisan bickering.