Mark Zuckerberg has faced fiery, pervasive criticism for not wanting his social network to decide what is true, at least when it comes to politics. On Thursday, as the 2020 presidential campaign hurtled toward the first primaries and caucuses, Facebook doubled down on that policy.
Rob Leathern, the company’s director of product management, wrote, “We have based [our policies] on the principle that people should be able to hear from those who wish to lead them, warts and all, and that what they say should be scrutinized and debated in public.”
But what Facebook defines as newsworthy may differ from how journalists, tasked with reporting facts, define it. And whether reporters can even keep pace with the sheer volume of political speech on such a sprawling platform in a time of industry consolidation, job cuts, and widespread disinformation is far from certain.
“Newsworthiness is an editorial decision. Journalists decide it by finding and verifying information that their readers can trust is accurate,” said Ryan Thomas, associate professor of journalism studies at the University of Missouri. “Facebook is not playing that role. It’s taking a hands-off approach. But the absence of fact-checking is an editorial decision. Newsworthy information can’t be any old lie.”
“The idea that there are enough journalists out there to fact check all the false claims on Facebook is naive,” Thomas added.
After all, 2019 was not kind to the ranks of employed journalists in America. Newsroom employment in the U.S. declined a historic 25 percent over the year, according to the Columbia Journalism Review—3,160 journalists, editors, and newsroom staffers lost their jobs. Critics often blame Facebook and Google; the former, in particular, drew fire for its errors in measuring video views, metrics that many publishers chased with scant resources in a now-comical series of “pivots to video.”
“Facebook’s influence on journalism has been disastrous. By extension, so has its influence on democracy,” Thomas said.
Facebook did not respond to a request for comment for this story.
Facebook’s own fact-checking partners have said they can’t keep up with the firehose of falsehoods on the platform. Donald Trump has run more than 50,000 ads in the past 90 days, 14,000 in the past week alone, according to the Facebook Ad Library. Rep. Kevin McCarthy (R-CA) promoted a debunked conspiracy theory about the impeachment inquiry in a Facebook ad at the end of September. There are 434 other members of the House of Representatives.
And that’s just federal elected officials. Pete Buttigieg has run 43,000 Facebook ads in the last 90 days, Elizabeth Warren 15,000, and Bernie Sanders 13,000. Candidates who aren’t frontrunners in the polls may enjoy the benefit of flooding the internet without much scrutiny: Michael Bloomberg has purchased 19,000 ads in the last 90 days, according to the Facebook Ad Library. Tom Steyer, a candidate who didn’t even qualify for the January Democratic debate, has published 2,600 ads in the past 90 days. Bloomberg has spent $100 million on campaign ads; Steyer has pledged to spend as much. Whose priority is it to comb through them all, and who has time to do so?
Under fire from lawmakers and his own employees, Zuckerberg said in November that Facebook aimed to “err on the side of greater expression,” echoing the doctrine of “more speech, not enforced silence” made famous by Supreme Court Justice Louis Brandeis.
Facebook framed the policy decision as one that preserves newsworthy speech, leaving responsibility up to voters. The company described newsworthiness as a force that can override the social network’s own enforcement policies in a September 2019 blog post: “If someone makes a statement or shares a post which breaks our community standards we will still allow it on our platform if we believe the public interest in seeing it outweighs the risk of harm,” wrote Nick Clegg, Facebook’s VP of global affairs and communications.
The company determines potential public interest by evaluating “country-specific circumstances,” the “nature of the speech,” and the “political structure of the country.” More imminent harm like immediate incitement of violence may make Facebook more likely to remove a piece of content, Clegg wrote.
Thomas pointed out that the company censors content in other countries—notably India, also a democracy—which “screams of hypocrisy,” he said.
“Facebook is terrified of being accused of partisan bias. What it’s instead decided to do is tip the marketplace in favor of lies,” said Thomas. “The logic of that point is that Facebook has no responsibility, but in fact it’s a failure of responsibility.”
Facebook has long made a show of policing misinformation, but researchers say it and other tech giants abdicate responsibility for lies and leave journalists to pick up the slack. As Dr. Joan Donovan, who studies media manipulation with the nonprofit Data & Society and lectures at Harvard, said in a congressional hearing Wednesday, “A single manipulation campaign can create an incredible strain on breaking news cycles, turning many journalists into unpaid content moderators.”
In another test of its policies, the company announced Monday it would ban videos manipulated by artificial intelligence, often called “deepfakes.” But it might not remove a politician’s post containing one.
“If a politician posts organic content that violates our manipulated media policy, we would evaluate it by weighing the public interest value against the risk of harm,” a Facebook spokesperson told The New York Times. Facebook will not allow deepfakes in ads at all, a spokesperson told The Verge.
Facebook has forged partnerships with news organizations in some cases; it now pays some publishers for their content in the Facebook News tab. In early 2019, the company promised to give away $300 million over three years to local news organizations across the United States and Europe.
Thomas called these efforts well-intentioned. But, he added, given the responsibility Facebook has thrust on an already embattled press corps—one facing new challenges virtually every day, some because of the company’s own actions—they amount to “a drop in the ocean.”