
Yes, Facebook’s Algorithm Is Biased. So Is Literally Everything.

CREDIT: AP PHOTO/BEN MARGOT

Facebook’s alleged media bias has dominated headlines this week, following a Gizmodo report in which anonymous sources claimed that Facebook employees deliberately suppressed news stories based on political ideology. Concern quickly escalated over whether Facebook is deciding what kind of news users read, specifically whether it pushes people toward more liberal views by eschewing conservative content.

Facebook has denied the claims but has vowed to investigate and at least tentatively cooperate with a Congressional probe on the matter. And there’s no evidence that Facebook did anything out of the ordinary.

According to Facebook’s internal guidelines that were leaked to the Guardian, content curators act a lot like news editors do when choosing which content to feature on the front page.

Facebook news curators elevate stories based on whether the world’s top 10 media organizations — CNN, BBC, Fox News, the Guardian, NBC, the New York Times, USA Today, the Wall Street Journal, the Washington Post, and Yahoo — are covering them. They follow a style guide and dismiss certain topics or keywords, either permanently or temporarily, if they are redundant or can’t be tied to an actual news event. All of this is done in addition to Facebook using an algorithm to find top trending stories.
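For illustration only, here is a hypothetical sketch of how a curation step like the one described above might look in code. The list of outlets comes from the leaked guidelines; the function names, data shapes, and logic are assumptions, not Facebook’s actual system.

```python
# Hypothetical sketch of a curation step: keep algorithmically detected trending
# topics that at least one major outlet is covering, and drop anything whose
# keywords are on a (temporary or permanent) blacklist. Illustrative only.

TOP_OUTLETS = {
    "CNN", "BBC", "Fox News", "The Guardian", "NBC", "The New York Times",
    "USA Today", "The Wall Street Journal", "The Washington Post", "Yahoo",
}

def curate(trending_topics, blacklisted_keywords):
    """Filter trending topics the way a human curator following the
    leaked guidelines plausibly might (assumed data shapes)."""
    approved = []
    for topic in trending_topics:
        covered = any(outlet in TOP_OUTLETS for outlet in topic["covering_outlets"])
        blocked = any(kw in topic["keywords"] for kw in blacklisted_keywords)
        if covered and not blocked:
            approved.append(topic)
    return approved
```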

Algorithms Are Not Objective

Public outrage, however, centered on the notion that Facebook’s decision to use human judgment to pick news undermined the unbiased integrity of the technology the company holds most dear: the algorithm.


In a letter to Facebook CEO Mark Zuckerberg, Sen. John Thune (R-SD) wrote, “If Facebook presents its Trending Topics section as the result of a neutral, objective algorithm, but it is in fact subjective and filtered to support or suppress particular political viewpoints, Facebook’s assertion that it maintains a ‘platform for people and perspectives from across the political spectrum’ misleads the public.”

But the senator has it wrong: Algorithms are not objective.

An algorithm is essentially a formula or set of steps used to get something done, like a recipe, or in Facebook’s case, floating the most popular or most-clicked stories to the top of your news feed. Facebook constantly tweaks its algorithm so the content users see first is the kind of stuff they like. The more you like, click on, comment on, or share posts, the more Facebook will show you similar posts.
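As a rough illustration of that feedback loop, here is a minimal, hypothetical sketch of engagement-weighted ranking. The weights, data shapes, and similarity check are assumptions chosen for clarity, not a description of Facebook’s real algorithm.

```python
# Minimal sketch: posts similar to what a user has already liked, clicked on,
# commented on, or shared score higher and float to the top of the feed.
# Weights and the topic-overlap similarity measure are invented for illustration.

ENGAGEMENT_WEIGHTS = {"like": 1.0, "click": 0.5, "comment": 2.0, "share": 3.0}

def score_post(post, user_history):
    """Score a post by how strongly the user has engaged with similar topics."""
    score = 0.0
    for action, topic in user_history:  # e.g. ("like", "politics")
        if topic in post["topics"]:
            score += ENGAGEMENT_WEIGHTS.get(action, 0.0)
    return score

def rank_feed(posts, user_history):
    """Sort the feed so the highest-scoring posts appear first."""
    return sorted(posts, key=lambda p: score_post(p, user_history), reverse=True)
```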

“Algorithms are meant to be gamed,” technologist and sociologist Zeynep Tufekci wrote in a blog post for The Message. “My Facebook friends have now taken to posting faux ‘congratulations’ to messages they want to push to the top of everyone’s feeds, because Facebook’s algorithm pushes such posts with the phrase ‘congratulations’ in the comments to the top of your feed.”
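The gaming Tufekci describes follows directly from that kind of rule: if a ranking system attaches a fixed boost to a phrase, posting the phrase earns the boost. A hypothetical sketch, with an invented boost value:

```python
# Illustrative only: a rule that boosts posts whose comments contain
# "congratulations" can be triggered by friends posting faux congratulations.

CONGRATS_BOOST = 5.0  # assumed value, for illustration

def boosted_score(base_score, comments):
    """Add a fixed boost when any comment contains 'congratulations'."""
    if any("congratulations" in c.lower() for c in comments):
        return base_score + CONGRATS_BOOST
    return base_score

# Example: a low-scoring post jumps the queue once the magic word appears.
print(boosted_score(1.0, ["nice!", "Congratulations!!"]))  # -> 6.0
```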

Mind Over Media Bias

Beyond feeding users’ own preferences back to them, algorithms also reflect the unconscious biases of their creators.


“The biases [of the coder] are going to come out in the algorithm whether the creator knows it or not,” said Deen Freelon, associate communication studies professor at American University in Washington, DC.

Algorithms know which user accounts you visit and which news stories you’re most likely to click on. But they’re imperfect, Freelon said. By Facebook’s own admission, its algorithm would often miss trending topics, like a bombing or the Ferguson protests, because no one on the site was talking about them yet.

As Tufekci points out, “An algorithm can perhaps surface guaranteed content, but it cannot surface unexpected, diverse and sometimes weird content exactly because of how algorithms work: they know what they already know.”

But Facebook’s media bias scandal isn’t about the technology. “What people are really upset about is that they really thought Facebook was different,” Freelon said.

“Facebook has a clear point of view on certain issues,” he said, referring to the company’s stance on privacy and government surveillance, immigration and Zuckerberg’s personal political leanings. “Why not this?” Because Facebook has taken positions on public policy issues, its leadership and employees likely fall in line with its values.

Conservatives latched onto claims that the social network suppressed news that supported their ideology in favor of the liberal media. But their quest for proof of a liberal media bias might be more conspiracy theory than a documented social trend.


“It’s a subjective phenomenon,” Freelon said. “You get two partisans looking at the same piece of news content and each can see it as biased toward the other side. Media bias isn’t a thing that’s out there in the world, it’s in our heads,” that is to say, it’s in readers’ perspectives.

“Even if you’re straight down the middle you’re going to get that accusation… Conspiracy theory, as it applies to the media, gives enough evidence for both sides that can convert the other.”

But it’s not all in people’s heads, Freelon said. Some media outlets have an explicit political slant, and their audiences, along with those who hold a skeptical view of mass media in general, may more readily perceive media bias.

“The tilt can be obvious for some outlets; that’s when you tend to see bias defined as a consistent preference of a certain viewpoint,” he said, using Fox News as an example, which, for the record, is one of the top 10 outlets on Facebook’s list.

The narrative that develops as a result is that any news outlet that doesn’t align with an explicitly conservative one is automatically against it, and therefore part of the liberal media.

Republicans’ attack on Facebook’s alleged media bias has more to do with users, and their influence, than with the machinations of trending news. More than half of American users get presidential election news on Facebook, but 79 percent identify as Democrats.