Facebook is harming our democracy, and Mark Zuckerberg needs to do something about it

Timothy B. Lee · Sunday, November 06, 2016, 10:37 am

A generation ago, newspapers and television news programs had a lot of influence over what people read and watched. Stories that made it on the front page got a lot of attention, while most people never heard about stories that mainstream media outlets chose to ignore.

Things are very different on the modern internet. Most people today don’t get their news by going to the home page of CNN or the New York Times. They open a social media app — most often Facebook — and read news stories that pop up in their news feed.

The result has been a disaster for the public’s understanding of current affairs. Reporters have come under increasing pressure to write “clickbait” articles that pander to readers’ worst impulses. Too-good-to-check stories gain more traction online than stories that are balanced and thoroughly reported. That has worsened the nation’s political polarization and lowered the quality of democratic discourse.

The fundamental problem here is that Facebook’s leadership is in denial about the kind of organization it has become. “We are a tech company, not a media company,” Zuckerberg has said repeatedly over the last few years. In the mind of the Facebook CEO, Facebook is just a “platform,” a neutral conduit for helping users share information with one another.

But that’s wrong. Facebook makes billions of editorial decisions every day. And often they are bad editorial decisions — steering people to sensational, one-sided, or just plain inaccurate stories. The fact that these decisions are being made by algorithms rather than human editors doesn’t make Facebook any less responsible for the harmful effect on its users and the broader society.

Facebook makes editorial judgments all the time

[Photo: Hillary Clinton campaigns in Salinas, California. Justin Sullivan/Getty Images]
It’s easy to lump all social media together, but there’s a crucial difference between Facebook and Twitter, the two social media sites that people most often use to find news.

Twitter really is a neutral platform. Each Twitter user chooses a list of people to follow. When you log into Twitter you see (with a few minor exceptions) a list of recent tweets by those people in strict chronological order. So Twitter can plausibly argue that it’s not responsible for the stories users see.

Facebook — which is now vastly more popular than Twitter — is different. The order of posts in the Facebook news feed is chosen by a proprietary Facebook algorithm. This algorithm takes into account a variety of factors, like how close you are to the poster, how many times a post has been shared or liked by other Facebook users, the type of post (wedding and baby announcements seem to bubble up to the top), and so forth. And then it chooses the posts it thinks you’re most likely to enjoy — whether they were posted three minutes or three days ago — and puts them at the top of your news feed.

Most of us will only ever see a fraction of the things our friends post on Facebook, so the ability to decide which posts to show first amounts to the power to control which posts users read at all.
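To make the contrast concrete, here is a minimal sketch in Python of the two ordering models described above. The signal names and weights are invented for illustration only; Facebook's real ranking system is proprietary and far more complicated. The structural point is the same, though: one feed is sorted purely by time, the other by a score that engineers design.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# A toy post with the kinds of signals the article mentions.
# Every field and weight below is hypothetical, for illustration only.
@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    likes: int
    shares: int
    is_life_event: bool         # e.g. a wedding or baby announcement
    closeness_to_viewer: float  # 0.0 to 1.0: how often the viewer interacts with this author

def chronological_feed(posts):
    """Twitter-style ordering: newest first, no other judgment applied."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def ranked_feed(posts, now):
    """Facebook-style ordering: each post gets an engagement score.
    The formula and weights are made up for this sketch."""
    def score(p):
        hours_old = (now - p.posted_at).total_seconds() / 3600
        engagement = p.likes + 2 * p.shares           # shares weighted more than likes
        affinity = 1 + 4 * p.closeness_to_viewer      # closer friends count for more
        life_event_boost = 3 if p.is_life_event else 1
        recency_decay = 1 / (1 + hours_old / 24)      # older posts fade but never vanish
        return engagement * affinity * life_event_boost * recency_decay
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    now = datetime(2016, 11, 6, 12, 0)
    posts = [
        Post("acquaintance", "Outrageous headline!", now - timedelta(hours=1), 500, 300, False, 0.1),
        Post("close friend", "We're having a baby!", now - timedelta(days=3), 40, 5, True, 0.9),
        Post("coworker", "Long, carefully reported story", now - timedelta(hours=6), 12, 2, False, 0.4),
    ]
    print([p.author for p in chronological_feed(posts)])  # strictly newest first
    print([p.author for p in ranked_feed(posts, now)])    # scored: the viral post and the
                                                          # three-day-old baby news outrank
                                                          # the sober six-hour-old story
```

Even in this crude sketch, the ranked feed surfaces the most-shared post and a days-old life event ahead of a recent, carefully reported article. Whoever chooses the signals and the weights is making an editorial judgment, whether or not they call it one.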

It’s easy to think of an algorithm as an alternative to making messy and imprecise human judgments. But as Slate’s Will Oremus pointed out earlier this year, that’s a mistake:

The intelligence behind Facebook’s software is fundamentally human. Humans decide what data goes into it, what it can do with that data, and what they want to come out the other end. When the algorithm errs, humans are to blame. When it evolves, it’s because a bunch of humans read a bunch of spreadsheets, held a bunch of meetings, ran a bunch of tests, and decided to make it better.
