From the Editors

Facebook’s problems are much deeper than “bad actors”

Bad for democracy, good for business

During the 2016 election, political disinformation proliferated on Facebook. Since then the social media giant has been on the defensive about its harmful effects on democracy. Recently it has been publicizing its efforts to go after individuals who routinely create fake accounts, engage in hate speech, or otherwise violate the company’s community standards. Facebook often refers to such repeat offenders as “bad actors.”

While the suggestion that a few people are bad actors and the rest of us are good ones is overly simple, it does help Facebook frame its democracy problems as bugs in an otherwise beneficent system. In fact, these problems are part and parcel of the company’s core business of attracting people’s attention, collecting their personal information, and selling both to advertisers. Disinformation aimed at the credulous, political ads that precisely target bias and fear, the “filter bubble” that erases dissent from your news feed—these are indications that Facebook works as designed. It’s keeping its users engaged, its advertisers happy, and its stock profitable. The damage Facebook has done to democracy is rooted in the very things that make it successful.

What’s more, much of this success has come at the direct expense of one pillar of democracy: news journalism. This was true even before news publishers’ web traffic came to live or die at Facebook’s algorithmic whim. Traditional news sources have long trafficked in verifiable information, professionally vetted; many of them serve broad audiences across cultural divides. This is in sharp contrast to what Facebook provides—yet Facebook has taken over vast swaths of the consumer attention and advertising business that news journalism once enjoyed. Facebook, where it’s hard to know what to believe or which voices are missing, is now many people’s primary source of news.

If Facebook cares about democracy, it should invest in it. Brooke Binkowski, a journalist who participated in Facebook’s 2016 fact-checking effort, has some proposals on this front. Writing in USA Today, she calls on Facebook to give users tools to choose whether and how its algorithms determine what they see. It should establish well-defined rules against disinformation and then empower moderators to take certain false posts down—rather than appealing to free speech as a reason to tolerate disinformation and profit from it, as with the recent fake video of House Speaker Nancy Pelosi slurring her words. And it should set up a fund to help local newsrooms do their job. “To support a free press is to support free speech,” writes Binkowski.

Facebook could make a big difference for democracy if it took ideas like these seriously. The rest of us could make a smaller difference by getting off Facebook and subscribing to a newspaper. The company wants us to believe that it’s dealing with the “bad actors,” the people who can’t be trusted to use the network as intended. But given that its intended purpose is to get our attention and, using an opaque algorithm, turn it into ever more clicks and ad dollars, there’s no reason to trust Facebook until it makes deeper changes.

A version of this article appears in the print edition under the title “Facebook vs. democracy.”