Facebook, responding to a report that editors manipulated its “trending” news section by keeping out politically right-wing stories and inserting stories that weren’t actually trending, has released a 28-page document: its guidelines for the section.
The Trending Review Guidelines were released in a May 12 blog post by Justin Osofsky, Facebook vice president of global operations, hours after they were leaked to The Guardian.
“Our goal has always been to deliver a valuable experience for the people who use our service,” wrote Osofsky. “The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum. Facebook does not allow or advise our reviewers to discriminate against sources of any political origin, period.”
Gizmodo reported May 9 that Facebook workers said they “routinely suppressed news stories of interest to conservative readers” and specifically blocked stories about Mitt Romney, Rand Paul and right-wing gatherings.
Gizmodo also reported that trends were determined by the editor on duty, and that the trending news team is a small group of young, primarily Ivy League-educated journalists.
“Depending on who was on shift, things would be blacklisted or trending,” a former so-called news curator told Gizmodo.
Mark Zuckerberg responded to the accusations in a Facebook post May 12.
“To serve our diverse community, we are committed to building a platform for all ideas. Trending Topics is designed to surface the most newsworthy and popular conversations on Facebook,” he wrote.
Addressing the Gizmodo accusations, he said Facebook had found no evidence that they are true.
“If we find anything against our principles, you have my commitment that we will take additional steps to address it,” Zuckerberg continued. “In the coming weeks, I’ll also be inviting leading conservatives and people from across the political spectrum to talk with me about this and share their points of view. I want to have a direct conversation about what Facebook stands for and how we can be sure our platform stays as open as possible.”
Facebook’s guidelines make clear that a number of teams are involved in selecting trending topics.
An editorial team accepts topics that reflect real-world events and “provide context to help people understand the trend and metadata to inform the algorithms that target trends.”
A “topic detection team” surfaces pending topics and ranks them after they’re accepted, while a content ranking team is responsible for delivering topic feeds once a topic is accepted.
Much of the Overview reads like the standard media guidelines of any publishing organization, offering style guidance (don’t capitalize prepositions) and warning against clichés. There is also guidance on what to blacklist (duplicate topics and items unrelated to a real-world event) and when to “inject” a topic.
Topics can be injected to consolidate or clean up a story (an offered example is that “ISIS” could replace “Flames of War”), as well as to promote a topic that’s in the “demo tool” but not the “review tool.” If it’s in neither tool, it can’t be added.
“The guidelines do not permit the suppression of political perspectives,” Osofsky explained in his blog post.
“About 40 percent of the topics in the queue get rejected by the reviewers because they reflect what is considered ‘noise’ … For example, braised, DVD, #weekend and #sale are all topics that were not accepted as trends over the past week. This tool is not used to suppress or remove articles or topics from a particular perspective,” he continued.
That Facebook should find itself demonstrating, as a New York Times headline put it, “How Editors and Algorithms Guide News” suggests a societal learning curve is under way regarding an increasingly common balancing act between human input and algorithms, as data science becomes an ever-greater fact of life.
Last year, after the introduction of its digital assistant, M, Facebook made headlines over whether M was entirely machine-driven or whether humans were also at work. (It was indeed the latter.)
“I think that the general perception is that the output of algorithms and analytics is informed by human judgment in its design, but also that it would be crazy to act on that output without the further exercise of human judgment,” Ezra Gottheil, a principal analyst with Technology Business Research, told eWEEK.
“Trending topics” does sound like something automatic—a “mere counting of occurrences,” said Gottheil.
“Upon consideration, however, one would realize that Facebook is in the business of keeping people engaged, and that highly controversial topics may not serve its purpose,” he continued. “The Facebook page is Facebook’s product; not to inspect it before it ships would be business malpractice. Microsoft would have been less embarrassed if it had subjected its Tay bot to adult supervision.”
Is it possible some people were exercising their biases? Sure.
“The examples given, however, seem reasonable to me. For some people, the ones complaining about the ostensible bias, the Drudge Report is mainstream and The New York Times is fringe. I think the majority of users would think otherwise,” Gottheil said.