Sometimes We Need Human Employees, Facebook Admits

Well, that's that.

In typical fashion, Facebook wants you to know it is a transparent company, one "that stands for connecting every person" in the world. In equally typical fashion, it's telling you this without telling you much at all.

The company published a blog post today explaining how its "trending topics" feature works. "Topics that are eligible to appear in the product are surfaced by our algorithms, not people," Justin Osofsky, Facebook's vice president of global operations, writes. "This product also has a team of people who play an important role in making sure that what appears in trending topics is high-quality and useful." (A comical statement to anyone who has taken a peek at their "trending topics" box.)

The post comes amid days of widespread concern about the feature. On Monday, Gizmodo published an article titled "Former Facebook Workers: We Routinely Suppressed Conservative News." It cited former employees of the trending team who said Facebook instructed them to act like "news curators" and told them they could inject certain stories into the trending section.

The story prompted an outcry among journalists. Republican Senator John Thune sent a public letter to Facebook, asking the company to explain whether it had "manipulated content." And today, The Guardian released internal Facebook documents outlining guidelines for the "trending" team, indicating that editors are in fact involved in deciding and describing those trending topics.

It seems Facebook wants to clear the air. "Facebook does not allow or advise our reviewers to discriminate against sources of any political origin, period," Osofsky writes.

So how does Facebook determine what topics end up in the "trending" section? The company says potential topics are identified by an algorithm that detects topics spiking in popularity on Facebook, supplemented by an external crawler that monitors the RSS feeds of a list of news sites. A team of people then reviews those topics to determine whether they meet the company's criteria, such as being corroborated by several outlets.
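To make that two-stage description concrete, here is a minimal, purely illustrative sketch of how such a pipeline might look: an algorithmic pass that flags topics spiking against their baseline mention volume, followed by a review gate that requires corroboration from several outlets. The function names, thresholds, and data structures below are assumptions made for the example, not anything Facebook has published.

```python
# Illustrative sketch only: hypothetical names and thresholds, not Facebook's code.
from collections import Counter

SPIKE_THRESHOLD = 3.0            # assumed ratio of recent mentions to baseline
MIN_CORROBORATING_OUTLETS = 2    # assumed "several outlets" review criterion

def surface_candidates(recent_mentions, baseline_mentions):
    """Flag topics whose mention volume is spiking relative to their baseline."""
    candidates = []
    for topic, count in recent_mentions.items():
        baseline = baseline_mentions.get(topic, 1)
        if count / baseline >= SPIKE_THRESHOLD:
            candidates.append(topic)
    return candidates

def passes_review(topic, rss_coverage):
    """Review-style check: is the story corroborated by enough outlets?"""
    outlets = rss_coverage.get(topic, set())
    return len(outlets) >= MIN_CORROBORATING_OUTLETS

# Example: mention counts from posts, coverage gathered by an RSS crawler.
recent = Counter({"eurovision": 900, "local bake sale": 40})
baseline = {"eurovision": 100, "local bake sale": 35}
coverage = {"eurovision": {"BBC", "Reuters", "AP"}}

trending_pool = [t for t in surface_candidates(recent, baseline)
                 if passes_review(t, coverage)]
print(trending_pool)  # ['eurovision']
```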

Osofsky explains that the trending section is then personalized with an algorithm for each Facebook user based on things like where that person lives, what kinds of Facebook pages that person likes, and how that person has engaged with trending topics in the past.
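Again as a purely illustrative sketch, a personalization step along those lines could be a simple scoring function over the signals Osofsky names: where the user lives, which pages they like, and how they have engaged with trending topics before. The scoring scheme and field names here are invented for the example; Facebook has not disclosed how its ranking actually weighs these signals.

```python
# Illustrative sketch only: a made-up scoring scheme, not Facebook's algorithm.
def personalize(trending_pool, user):
    """Rank eligible topics for one user by rough relevance signals."""
    def score(topic):
        s = 0.0
        if topic["region"] == user["location"]:
            s += 1.0                                          # where the person lives
        s += len(topic["categories"] & user["liked_pages"])   # pages they like
        s += user["past_engagement"].get(topic["name"], 0)    # prior engagement
        return s
    return sorted(trending_pool, key=score, reverse=True)

user = {
    "location": "US",
    "liked_pages": {"sports", "technology"},
    "past_engagement": {"nba playoffs": 2},
}
topics = [
    {"name": "nba playoffs", "region": "US", "categories": {"sports"}},
    {"name": "eurovision", "region": "EU", "categories": {"music"}},
]
print([t["name"] for t in personalize(topics, user)])
# ['nba playoffs', 'eurovision']
```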

Within the blog post, Osofsky linked to the trending team's "review guidelines," which seem to serve as process and style recommendations. Yet he does not explain the extent to which the trending team makes value judgments, in fact editorial judgments, about the stories to include or exclude from "trending." He doesn't say who uses "trending" or how often people click there. He also doesn't explain the extent to which Facebook itself guides the editorial team making those bigger-picture decisions.

But one thing is clear: an algorithm isn't always enough. Even Facebook needs humans sometimes.