
A giant Facebook “like” seen at the company’s new headquarters in Menlo Park, California. Photo by Robert Galbraith/Reuters
Editor’s note: During our Facebook Live discussion with The Poynter Institute over Facebook’s role as a gatekeeper of news, The Guardian published documents that showed how the company’s Trending Topics team relied on human intervention to decide what was newsworthy. Hours after The Guardian published the leaked guidelines, Facebook told Buzzfeed the documents were an “older version” and posted the updated guidelines. On Friday, Facebook CEO Mark Zuckerberg responded with his own post.
Last week, a report surfaced claiming that former Facebook employees suppressed conservative stories on the world’s largest social network. If true, this would counter the company’s long-standing claims that it is a neutral news source.
In its internal policy, Facebook says the site’s Trending section is curated by an objective algorithm. But the report, which cites former Facebook workers, says the algorithm isn’t doing the curating alone: human workers also “inject” some stories into the section and suppress others.
Facebook has denied the allegations, saying there’s “no evidence that the anonymous allegations are true.” But the report has caught the attention of the social media community and Senate Republicans.
What are the ethics surrounding Facebook’s involvement in journalism? What level of transparency should platforms like Facebook provide?
We had a Facebook Live discussion, moderated by PBS NewsHour Weekend anchor Hari Sreenivasan, on Facebook’s role as a distributor and gatekeeper of news. We were joined by Kelly McBride, vice president and media ethicist at The Poynter Institute, who wrote this column on the subject.
Here are a few points to consider:
WHAT ARE THE ACCUSATIONS?
Gizmodo interviewed a handful of former news curators who said they routinely omitted right-wing news sources from the “trending” section in the upper right-hand corner of the site because of the staffers’ political bias.

An example of the Trending Topics box that appears at the upper right-hand corner of the site.
Stories about the Conservative Political Action Conference (CPAC), Mitt Romney, and Wisconsin Gov. Scott Walker were left off the list “because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz,” one anonymous curator told Gizmodo.
This also meant, according to some curators, that news stories from conservative outlets like Breitbart and Washington Examiner weren’t considered for inclusion unless The New York Times or the BBC had covered the same stories too.
Additionally, curators said they used an “injection tool” that included stories in the “trending” section that normally wouldn’t have been picked up by Facebook’s algorithm. One curator told Gizmodo that “Facebook got a lot of pressure about not having a trending topic for Black Lives Matter.”
But what’s the value of being exposed to viewpoints that differ from your own?
“It prepares you to be a better citizen in a democracy,” McBride said. “It prepares you to have conversations with people who have different viewpoints. It also helps you analyze and solidify your own positions, even if you don’t agree with them.”
WHAT WAS FACEBOOK’S RESPONSE?
On Friday, chief executive Mark Zuckerberg posted a statement on Facebook, stating that the company “found no evidence that this report is true.”
“We have rigorous guidelines that do not permit the prioritization of one viewpoint over another or the suppression of political perspectives,” Zuckerberg said in the statement, adding that Facebook will conduct a full investigation and take “additional steps” to address anything that countered the company’s principles.
Earlier in the week, Tom Stocky, the head of Facebook’s Trending Topics, responded to the allegations, saying there are “rigorous guidelines in place for the review team to ensure consistency and neutrality.”
He also said the company takes the allegations “extremely seriously.”
“What we don’t know is what Facebook was doing that caused the people on that team to believe they were supposed to screen out conservative viewpoints,” McBride told the NewsHour. “Was it a memo? Was it somebody who directly told them to do that? Was it another interpretation of another policy?”
South Dakota Sen. John Thune sent a letter to Zuckerberg seeking similar answers. He asked for more clarification on what curators on the trending team do, what standards they adhere to and what steps the company was taking to address the problem.
Thune asked to receive the requested information by May 24.
On Thursday, Facebook posted 28 pages of internal guidelines for its trending team.
In a post on Facebook, Justin Osofsky, vice president of the company’s global operations, said the “guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum.”
“With Facebook, humans are never not involved,” Robyn Caplan, a research analyst at Data & Society, told the Times. “Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.”
Issie Lapowsky of Wired wrote that Facebook’s stance of neutrality comes with a whole new set of considerations for the platform.
“If tech companies are now playing the role that traditional publishers have for centuries, then they need to begin having the same conversations about transparency and disclosure, ethics and fairness,” Lapowsky wrote. “With election season pressure rising, it looks like those conversations may happen sooner than later.”
NEXT STEPS FOR FACEBOOK
McBride said Facebook needs to be more transparent about the role that both humans and the algorithm play in news selection. A public editor or ombudsman that independently answers questions from the public could help, she added.
“If Facebook is smart, they’ll appoint a public-facing manager to explain and enhance the policies they have in place, and maybe even create some new policies to garner public faith,” McBride said in an email.