Facebook wants to fight fake news with facts

But will it work?

Facebook CEO Mark Zuckerberg at the Facebook Communities Summit in Chicago in June. CREDIT: AP Photo/Nam Y. Huh

Facebook wants to arm its users with facts. The social network announced on Thursday that it will begin a broad rollout of a new feature called “Related Articles,” which seeks to debunk posts that perpetuate hoaxes and conspiracy theories.

In the never-ending battle against fake news, Facebook has set out to give users tools to help them sift through the steady stream of links, many unverified, that come through their news feeds. In this latest update, Facebook has tweaked its algorithm to spot potentially fake reports based on user feedback and comments on the posts. Those stories are then reviewed by third-party fact-checkers, who post aggregated pieces alongside the original story to give readers more context about it.

“In addition to seeing which stories are disputed by third-party fact checkers, people want more context to make informed decisions about what they read and share,” Facebook product manager Sara Su wrote in an updated news release.

Users will see links to the suspected false news story and the related articles at the same time, but the company recognizes that there’s a limit to what can be done. Facebook has found that high-volume posters are more likely to share fake news. The company has also taken out newspaper ads warning voters to beware of suspicious content, and previously released a labeling tool that alerted users to questionable posts. But that might not be enough.


In its report announcing a plan to halt government manipulation of the platform, Facebook said the spread of false information won’t disappear as long as there are people who can’t tell the difference.

“In the end, societies will only be able to resist external information operations if all citizens have the necessary media literacy to distinguish true news from misinformation,” the company wrote.

Beyond media literacy, an increasingly important skill in the digital landscape, there’s the issue of inherent bias. Research has shown that people are more drawn to unsubstantiated content, particularly if it confirms a preconceived belief, and even more so if that content is perceived as being under attack. The association is so strong that trying to disprove a hoax with facts is often futile and can cause people to believe falsehoods more fervently. Fake news also thrives in news environments with constant updates, where people feel too overwhelmed to verify information on their own. And flagging a story’s questionable veracity, as Facebook’s fake news labeling tool did, can sometimes increase traffic to false stories.

The silver lining is that the fake news problem has raised awareness. Studies have shown students are largely unable to distinguish fake news from real news, and schools are fighting back. Colleges, high schools, and primary schools nationwide are putting more emphasis on teaching about fake news so students can better navigate the media. California legislators have even gone as far as to introduce a bill mandating media literacy education.

So while fake news isn’t going anywhere anytime soon, here’s hoping that with a little help from tech companies and the education system, it will become more manageable in the near future.