The interim leader of TikTok, Vanessa Pappas, has just proposed that social media companies agree to warn one another about violent, graphic content on their platforms. Specifically, TikTok proposes a “hashbank for violent and graphic content” with a special concern about suicide videos. The company believes the hashbank and subsequent cross‐platform suppression of the objectionable content would “significantly reduce the chances of people encountering it and enduring the emotional harm that viewing such content can bring.”
As it happens I came near some violent and graphic content this morning. A friend sent me a link to a BBC News story about “Cameroon soldiers jailed for killing women and children.” The embedded video, whose label says it “contains disturbing scenes”, apparently depicts the murders of two women and children. The video might be a candidate for TikTok’s proposed hashbank. Using the BBC video to fix ideas, let’s examine the costs and benefits of TikTok’s proposal.
Let’s begin with where TikTok is on solid ground. Some of its users are young people who may be protected from extreme speech in ways that adults should not be. Preventing adults from seeing content they wish to see, however, is paternalistic. There is another wrinkle here. TikTok mentions people “encountering” objectionable content. The dictionary tells us that to encounter is “to come upon or experience especially unexpectedly.” If I choose to see risky content, the result may surprise me, but the risk is part of the choice, even if the content turns out to be more than I would have wanted to see ex ante. Encountering content thus seems more like being algorithmically served content than personally choosing to see it.
Imagine the BBC video at the link began playing once I loaded the page, and I viewed the murders. Perhaps I should see the murders. All of us have encountered people or events or ideas that have changed our lives for the better even though we did not choose any of them or, if asked beforehand, would have declined the opportunity to meet that person or see something horrific. Encountering the deaths of four people might make me a better person or even a better employee for Cato or the Oversight Board. I would perhaps better appreciate human frailty or the sad lot of much of humanity.
But I didn’t watch the video. Had the video played automatically after the page loaded and had I encountered the murders, I might suspect BBC editors had become paternalists bent on improving my moral fiber through hard lessons. A focus on “encounters” rather than choices improves the case for a hashbank that crosses platforms.
TikTok’s proposal does have risks for speech. Return again to the original story. Initially Cameroonian officials said the events depicted in the video were “fake news.” BBC notes that the video gained international attention and millions of views on Twitter. If the TikTok group had put the video into the new hashbank and removed it from participating platforms, such international attention might never have happened. Some might have believed the murders were indeed “fake news.” The four soldiers involved might not have received 10 years each in prison for the killings.
You might say all of that is unlikely given the brutality depicted in the video. But much goes on in our world, more than a little of it brutal or tragic. Unseen might well have been unknown in this case. A hashbank exists to guide takedowns from a platform, not to prompt deliberation over whether the content inside should be left up or taken down. At first, and even later, some companies might refuse to suppress content from the hashbank on grounds of “newsworthiness.” But with time, the stigma attached to material in the hashbank will also apply to disturbing content that should be seen and heard. If the hashbank did not exist, companies might disagree more about graphic and violent content. Creating a single standard for bad content means everyone will make the same mistake.
And some mistakes may be more than an inevitable tradeoff between false positives and false negatives. Cameroonian government officials could not compel BBC News to bury the shocking video. Who will decide which content goes into the TikTok hashbank? No doubt many government officials across the world will be happy to see their citizens protected from graphic content. But they might also be happy to see content removed that cuts against their interests. Many credible experts believe TikTok reflects the interests of the Chinese government. Might a video of a small crackdown in 2023 in China’s Xinjiang region find its way into the hashbank?
How able would American companies be to resist U.S. officials seeking to “protect” social media users from videos of official misconduct? Social media companies in the United States are facing a tough few years of government oversight and potential regulation. They might not say “no” to officials. Indeed, if the content is graphic and a video’s implications are unpopular, the easiest way forward might be the hashbank. The leaders of other nations will at one time or another have similar interests in “protecting” their citizens from violent and graphic content.
Of course, we are some distance from our starting point. BBC did not suppress the Cameroonian video; it put the killings behind a warning label. I don’t think BBC News limited my rights by giving me a choice whether or not to see those murders. Others (or the parents of young people) may also wish to be protected in this limited way. (The problem of offering stronger protections to young people will not be solved by a label.) Social media companies may act reasonably as our agents to prevent encounters with (but not the selection of) violent, graphic content. But the hashbank seems unlikely to foster labeling as opposed to suppression of images.
Many social media users might wish to avoid unexpected encounters with graphic, violent content. They might hope their favorite platform will help them avoid such content while leaving them free to choose what they see. A single hashbank of graphic and violent images might help social media companies serve their users in this way. But concentrating power always carries risks as well as benefits. TikTok’s proposal may suppress images that are both graphic and valuable. Perhaps no one should be forced to see disturbing but valuable images, but they should be available somewhere for those strong enough to care. The proposed hashbank will increase the incentives for governments and organized interests to try to suppress inconvenient images. As always, social media companies are free to pursue ideas as they wish. But in this case the risks to speech counsel caution.
This article first appeared in the Cato at Liberty blog, a publication of the Cato Institute.