Wars Are Bad For Free Speech—The Coronavirus War Will Be No Exception

But there's a glimmer of hope when it comes to social media.

The prominent legal scholar Geoffrey Stone reminds us that war is a perilous time for freedom of speech. The struggle with COVID-19 seems like a war. Some have invoked executive authorities created for, and justified by, wartime exigency. Unity will be needed to defeat this “invisible enemy.” How is free speech doing in this difficult time?

Speech may be restricted by public or private authorities. Public officials have strong incentives to censor or restrict speech, perhaps especially during a crisis; hence, the First Amendment limits their powers over “the freedom of speech.” Content moderators may also restrict speech. Their powers in this regard are limited largely by their own commitments to free speech and by consumer choice.

Some saber-rattling by local police departments aside, the government has done little to limit dissent or a diversity of views. Yesterday, the Democratic leadership proposed a stimulus bill that imposed additional disclosures and banned lobbying by companies receiving aid. This proposal has little chance of becoming law, though it bears watching.

Social media platforms have been active in both advancing and suppressing speech. Most tech companies are providing their users with expert information about COVID-19. Facebook is actively trying to “steer [users] toward authoritative sources” about the pandemic. By their own accounts, they are also suppressing a lot of misinformation. Facebook has also devoted an extra million dollars to fact-checking claims on its platform, though much of its emergency moderation effort has been focused on the less politically salient, though more immediately harmful, threat of mental health crises fostered by isolation. Suppression can be legitimate, as a recent case shows.

Shortly after the realities of the coronavirus pandemic took hold in the United States, a young Californian technologist named Aaron Ginn wrote a paper arguing that the government’s response to the virus was overblown and costly. He posted the essay to Medium, an online platform specializing in hosting such writings. Less than a day later, the moderators at Medium removed the essay. It had attracted extensive criticism on Twitter from Carl T. Bergstrom, a professor of biology at the University of Washington, who noted that the paper was getting “too much traction here and even in traditional media.” After the removal from Medium, the Ginn paper was then uploaded to at least two sites, one of which was Zerohedge, a website that sometimes pushes conspiracy theories. The venue of republication has some effect on readers’ perception of the article, just as the article’s presence on Medium might affect Medium’s reputation. Republication by Zerohedge may be reputationally poisonous, while an archive.org link, as I have used above, merely indicates that the content in question is no longer available at its original source. The extent to which the perceived reputational effects of hosting and deplatforming drive the politics of content moderation is underappreciated.

From a libertarian perspective, everything seems in order at this point. A person expressed a controversial opinion and published it online via a popular blogging platform. Acting within its rights, the platform’s moderators took down the essay. They may have done so to avoid being associated with controversial and perhaps harmful speech. (To his credit, Ginn himself would later affirm that Medium and other platforms “are free to associate with whom they want.”) Meanwhile, the essay had prompted counterspeech by Bergstrom challenging its claims about the pandemic. The suppression of the essay applied only to Medium. Everyone had a right to download the essay while it was on Medium, or from archive.org after its removal. Readers had no legal obligation to refrain from reposting the essay elsewhere. Ginn’s article remained available, counterspeech sought to expose its shortcomings, and everyone retained the responsibility to make up their own minds about Ginn’s arguments.

Does speech misinforming people about the pandemic incite a kind of violence? Speech that misinforms people may convince them to act in ways that spread COVID-19, which in turn infects initially unidentified people who die or incur health care costs. I do not think such “incitement” meets the legal test for justifying criminalizing speech. The speech in question does not intentionally bring about imminent harm. But that incitement test applies to public, not private, authorities. Tech companies believe they are suppressing speech to halt the spread of the virus and attendant harms, fulfilling a public responsibility. In other words, they are balancing the value of some speech against the probability of it doing harm in the general population. In current circumstances, the platforms’ antipathy to hoaxes and conspiracy theories seems justified. But doesn’t advocating a return to economic life by Easter also pose a certain probability of doing harm to some people? How much speech threatens harm in current circumstances and beyond?

Finally, consider the potential costs of false positives by content moderators. Let’s imagine almost all the speech removed from the biggest platforms does threaten to harm some people. Yet content moderators will inevitably make mistakes, especially if moderation by algorithm matters more in the coming weeks. Imagine also that a contrarian offers an unexpected insight about the pandemic, one that could save lives. Once shared on social media, his idea might seem not just contrarian but dangerous. Moderators might then remove his post. It might then turn up almost immediately on a fringe site where the idea goes unnoticed and unconsidered. Will many people be saying in late July “if we had only known!” about the contrarian insight that would have saved lives?

Well, yes, they might be saying that later this year. But notice the contrarian idea was not suppressed. It appeared elsewhere; anyone could consider its arguments, though most would stay clear of its marginal host. No system of social choice is perfect. But private content moderation beats public censorship even when the former suppresses speech that has great value. The nature of the internet means such suppression is never complete. Under a regime of piecemeal private moderation, it is still possible that the valuable speech will be heard and heeded. Because platforms are open by default and moderation occurs post-publication, even fringe ideas can get an initial hearing. Censorship seeks to make sure the relevant speech is neither heard nor heeded.

Our current crisis will not be good for free speech. Classical liberals may regret anyone suppressing speech, even when justified. However, private moderators can legitimately suppress speech on social media. Indeed, leaders of the companies may feel they have a larger responsibility to suppress some speech during a pandemic. We should keep in mind that the suppressed speech will be removed from one platform, not from the internet. It may also be stigmatized. That outcome will be better for speech than being censored and forgotten. We still might wonder how slippery the slope may be in defining “harmful speech” and how costly the moderators’ errors will turn out to be. Giving acceptable answers to those questions is also part of the responsibility tech companies have to the larger public in this crisis and beyond.

This article by John Samples first appeared at CATO.

Image: School media department staff are editing online classes, following the outbreak of the coronavirus disease (COVID-19) in the holy city of Karbala, Iraq March 26, 2020. REUTERS/Abdullah Dhiaa Al-Deen