Facebook scrambled this week to respond to a new and startling line of attack: accusations of political bias. The outcry was set off by a report on Monday morning by the website Gizmodo, which said Facebook's team in charge of the site's "trending" list had intentionally suppressed articles from conservative news sources. The social network uses the trending feature to indicate the most popular news articles of the day to users.
Facebook denied the allegations after a backlash – from both conservative and liberal critics – erupted. "It is beyond disturbing to learn that this power is being used to silence viewpoints and stories that don't fit someone else's agenda," read a statement from the Republican National Committee. "NOT LEANING IN... LEANING LEFT!" blared the top story on the Drudge Report, a widely read website.
The journalist Glenn Greenwald, hardly a conservative ally, weighed in on Twitter: "Aside from fueling right-wing persecution, this is a key reminder of dangers of Silicon Valley controlling content." And Alexander Marlow, the editor in chief of Breitbart News, a conservative-leaning publication, said the report confirmed "what conservatives have long suspected".
Facebook, in response, said that it follows rigorous guidelines “to ensure consistency and neutrality” and that it works to be inclusive of all perspectives. “We take allegations of bias very seriously,” a Facebook spokeswoman said in a statement. “Facebook is a platform for people and perspectives from across the political spectrum.”
News source
The back-and-forth highlights the extent to which Facebook has muscled its way into the United States' political conversation – and the risks that the company faces as it becomes a central force in news consumption and production. With more than 222 million monthly active users in the United States and Canada, the site has become a place people flock to in order to find out what is going on. Last year, a study by the Pew Research Center, in collaboration with the Knight Foundation, found that 63 per cent of Facebook's users considered the service a news source.
In April, Facebook embraced this role openly, releasing a video to implore people to search Facebook to discover “the other side of the story”. Politicians have increasingly shared their messages through the social network. “It’s not that Facebook has changed fundamentally over the past four, eight years,” said Paul Brewer, director of the University of Delaware Center for Political Communication. “It’s the sheer volume of communication that’s taking place, and it’s that politicians know that they need to be using Facebook now more than ever before to communicate.”
As it has become more influential, Facebook has taken pains to say that it is not an echo chamber of similar opinions. In a peer-reviewed study published last year, Facebook’s data scientists analysed how 10.1 million of the most partisan American users on the social network navigated the site over a six-month period. They found that people’s networks of friends and the articles they saw were skewed toward their ideological preferences – but that the effect was more limited than the worst case some theorists had predicted, in which people would see almost no information from the other side.
Yet Gizmodo’s report raises questions about the effects that Facebook’s staff members and their biases, even unconscious ones, have on the social network. While Facebook has pledged to sponsor both the Democratic and Republican national conventions in the US, its top executives have not been shy about expressing where their political sympathies lie.
At a Facebook conference in April, Mark Zuckerberg, the company's chief executive, warned of "fearful voices building walls", in reference to Donald Trump, the probable Republican presidential candidate. The allegations against Facebook also put the spotlight on how it chooses which news articles to show users under the trending function – on desktop computers, "trending" displays on the right side of screens; on cellphones, it appears when users search.
Newsroom operation
Facebook has long described its trending feature as largely automatic. “The topics you see are based on a number of factors including engagement, timeliness, pages you’ve liked and your location,” according to a description on Facebook’s site.
The trending feature is curated by a team of contract employees, according to two former Facebook employees who worked on it and who spoke on the condition of anonymity because of nondisclosure agreements. They said they considered themselves members of a newsroom-like operation, where editorial discretion was not novel but was an integral part of the process.
Any “suppression”, the former employees said, was based on perceived credibility – any articles judged by curators to be unreliable or poorly sourced, whether left-leaning or right-leaning, were avoided, though this was a personal judgment call.
The perception of Facebook as a more conventional news operation opens it to a more familiar line of criticism, which has been mounted against news organisations left and right, large and small, for decades. According to a report last year by Pew, only 17 per cent of those surveyed said that technology companies had a negative influence on the country. For the news media, that number was 65 per cent – and rising.
“The agenda-setting power of a handful of companies like Facebook and Twitter should not be underestimated,” said Jonathan Zittrain, a professor of computer science and law at Harvard University. “These services will be at their best when they are explicitly committed to serving the interests of their users rather than simply offering a service whose boundaries for influence are unknown and ever-changing.”
By late Monday, users on the social network looking for more information about the Gizmodo report did not have to look far: It was among the top articles trending on Facebook.
New York Times