According to Facebook’s guidelines, a team of news editors are instructed on how to ‘inject’ stories into the trending topics module. Photograph: Adam Berry/Getty Images

Facebook news selection is in hands of editors not algorithms, documents show


Exclusive: Leaked internal guidelines show human intervention at almost every stage of its news operation, akin to a traditional media organization

Leaked documents show how Facebook, now the biggest news distributor on the planet, relies on old-fashioned news values on top of its algorithms to determine what the hottest stories will be for the 1 billion people who visit the social network every day.

The documents, given to the Guardian, come amid growing concerns over how Facebook decides what is news for its users. This week the company was accused of an editorial bias against conservative news organizations, prompting calls for a congressional inquiry from the US Senate commerce committee chair, John Thune.

The boilerplate the company provides to users about its news operations suggests that much of its news selection is determined by machines: “The topics you see are based on a number of factors including engagement, timeliness, Pages you’ve liked and your location,” says a page devoted to the question “How does Facebook determine what topics are trending?”
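Facebook has not published how those four factors are weighted or combined. Purely as an illustration, a score along the following lines would be consistent with the quoted description; every name, weight and the decay half-life in this sketch is an assumption invented for the example, not Facebook’s actual formula.

```python
# Illustrative sketch only: one plausible way to combine the four publicly
# stated factors (engagement, timeliness, liked Pages, location) into a
# single trending score. All names and weights here are assumptions.
from dataclasses import dataclass


@dataclass
class TopicSignals:
    engagement: float        # e.g. normalized shares/comments per hour
    hours_since_peak: float  # how long ago activity on the topic peaked
    page_affinity: float     # 0..1: overlap with Pages the user has liked
    local_relevance: float   # 0..1: match between topic and user's location


def trending_score(s: TopicSignals) -> float:
    # Exponential decay stands in for "timeliness"; the six-hour
    # half-life is an arbitrary assumption for this sketch.
    timeliness = 0.5 ** (s.hours_since_peak / 6.0)
    # Personal signals boost, rather than gate, the popularity signal.
    return s.engagement * timeliness * (1.0 + s.page_affinity + s.local_relevance)
```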

But the documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its “trending module” headlines – the list of news topics that shows up on the side of the browser window on Facebook’s desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.

The guidelines show human intervention – and therefore editorial decisions – at almost every stage of Facebook’s trending news operation, which was run by a team that at one point numbered as few as 12 people:

  • A team of news editors working in shifts around the clock was instructed on how to “inject” stories into the trending topics module, and how to “blacklist” topics, removing them for up to a day for reasons including that a topic “doesn’t represent a real-world event”, a judgment left to the discretion of the editors.
  • The company wrote that “the editorial team CAN [sic] inject a newsworthy topic” as well if users create something that attracts a lot of attention, for example #BlackLivesMatter.
  • Facebook relies heavily on just 10 news sources to determine whether a trending news story has editorial authority. “You should mark a topic as ‘National Story’ importance if it is among the 1-3 top stories of the day,” reads the trending review guidelines for the US. “We measure this by checking if it is leading at least 5 of the following 10 news websites: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo.” (A minimal sketch of this 5-of-10 check follows this list.)
  • Strict guidelines are enforced around Facebook’s “involved in this story” feature, which pulls information from Facebook pages of newsmakers – say, a sports star or a famous author. The guidelines give editors ways to determine which users’ pages are appropriate to cite, and how prominently.
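The 5-of-10 rule quoted above is concrete enough to render as a short check. In this minimal sketch, only the list of sites and the five-site threshold come from the leaked guidelines; the function and input names are hypothetical.

```python
# Hypothetical rendering of the "National Story" rule: mark a topic as
# nationally important if it is leading on at least 5 of the 10 listed
# news sites. Only the site list and threshold come from the leaked
# guidelines; everything else is invented for this sketch.

TRUSTED_SITES = [
    "BBC News", "CNN", "Fox News", "The Guardian", "NBC News",
    "The New York Times", "USA Today", "The Wall Street Journal",
    "Washington Post", "Yahoo News",
]


def is_national_story(topic: str, leading_story_by_site: dict[str, str]) -> bool:
    """True if `topic` leads the front page of at least 5 of the 10 sites.

    `leading_story_by_site` maps each site to the topic an editor recorded
    as leading its front page during a review shift.
    """
    leads = sum(1 for site in TRUSTED_SITES
                if leading_story_by_site.get(site) == topic)
    return leads >= 5
```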
[Embedded document: Facebook trending news guidelines]

The company’s guidelines are very similar to a traditional news organization’s, with a style guide reminiscent of the Associated Press guide, a list of trusted sources and instructions for determining newsworthiness. (The Guardian also obtained the guidelines for moderating the “in the story” feature, now called “involved in this story”; the guidelines for the company’s Facebook Paper app; and a broader editorial guide for the app.)

The guidelines are sure to bolster arguments that Facebook has made discriminatory editorial decisions against rightwing media. Conservatives would label the majority of Facebook’s primary sources as liberal.

They also appear to undermine claims this week from Facebook’s vice-president of search, Tom Stocky, who posted a statement addressing the controversy on 9 May. “We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so,” he wrote.

Stocky’s statement may depend on the definition of the word “artificially”. In interviews with the Guardian, three former editors said they had indeed inserted stories into the trending feed that were not otherwise visible to users, in order to make the experience more topical. All denied personal bias, but all said the human element was vital.

A second list, of 1,000 trusted sources, was provided to the Guardian by Facebook. It includes prominent conservative news outlets such as Redstate, Breitbart, the Drudge Report and the Daily Caller.

Former employees who worked in Facebook’s news organization said that they did not agree with the Gizmodo report on Monday alleging partisan misconduct on the part of the social network. They did, however, acknowledge that human judgment played a role, in part because the company’s algorithm did not always produce the best possible mix of news.

Specifically, complaints about the absence from trending feeds of news reports about clashes between protesters and police in Ferguson in 2014 were evidence to Facebook that – in the specific case of the trending module – humans had better news judgment than the company’s algorithm. Multiple news stories criticized Facebook for apparently prioritizing Ice Bucket Challenge videos over the riots. Many said the incident proved that Twitter was the place for hard news, and Facebook was a destination for fluff.

“The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum,” said Justin Osofsky, Facebook’s vice-president of global operations. “Facebook does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period. What these guidelines show is that we’ve approached this responsibly and with the goal of creating a high-quality product – in the hopes of delivering a meaningful experience for the people who use our service.

“Trending Topics uses a variety of mechanisms to help surface events and topics that are happening in the real world. In our guidelines, we rely on more than a thousand sources of news – from around the world, and of all sizes and viewpoints – to help verify and characterize world events and what people are talking about. The intent of verifying against news outlets is to surface topics that are meaningful to people and newsworthy. We have at no time sought to weight any one viewpoint over another, and in fact our guidelines are designed with the intent to make sure we do not do so.”
