Over the last five years or so, there’s been much consternation over the role Facebook plays in political discourse. Massive amounts of disinformation were spread during the 2016 election campaign, some due to foreign interference and some due merely to bad actors gaming the algorithm. That was also the year of the Cambridge Analytica scandal.
In the 2020 election, Facebook made several efforts to root out disinformation on its platform related to both the election and coronavirus, ultimately suspending then-President Donald Trump’s account following the riot at the Capitol on Jan. 6. This has led to complaints from all sides of the political spectrum.
This week, Facebook announced some changes to how its News Feed will look.
First, on Monday, Facebook announced a crackdown on misinformation about vaccines.
“Today, we are expanding our efforts to remove false claims on Facebook and Instagram about COVID-19, COVID-19 vaccines and vaccines in general during the pandemic,” the company said in an announcement. While Facebook has been removing some false claims about vaccines since the first coronavirus vaccines were introduced late last year, that policy has been expanded to include several additional claims: that coronavirus is man-made, that vaccines are not effective, that it’s safer to get the disease than the vaccine, and that vaccines are “toxic, dangerous or cause autism.”
“We will begin enforcing this policy immediately, with a particular focus on Pages, groups and accounts that violate these rules, and we’ll continue to expand our enforcement over the coming weeks,” the company said. “Groups, Pages and accounts on Facebook and Instagram that repeatedly share these debunked claims may be removed altogether.”
And then, on Wednesday, Facebook made another announcement about plans to test “reducing political content in News Feed.”
The post, by Facebook product management director Aastha Gupta, referenced Mark Zuckerberg’s recent statement on an earnings call that “people don’t want political content to take over their News Feed.” Facebook, therefore, is taking steps to respond to those complaints.
“Over the next few months, we’ll work to better understand peoples’ varied preferences for political content and test a number of approaches based on those insights.”
The company will start by temporarily reducing the “distribution of political content” in News Feed in Canada, Brazil and Indonesia, with the change coming to the United States in the coming weeks.
“To determine how effective these new approaches are, we’ll survey people about their experience during these tests. It’s important to note that we’re not removing political content from Facebook altogether,” the post said. “Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person’s appetite for it at the top of their News Feed.”
Stephen Silver, a technology writer for The National Interest, is a journalist, essayist and film critic, who is also a contributor to Philly Voice, Philadelphia Weekly, the Jewish Telegraphic Agency, Living Life Fearless, Backstage magazine, Broad Street Review and Splice Today. The co-founder of the Philadelphia Film Critics Circle, Stephen lives in suburban Philadelphia with his wife and two sons. Follow him on Twitter at @StephenSilver.