Facebook is taking a hard stance against the weaponized use of fake news.
In a security report the social network released Thursday, the company named “information operations,” the strategic actions governments and individuals use to distort public opinion, as one of the platform’s biggest challenges.
Unlike clicks-for-pay scams and hacks, information operations, which can amplify false news through fake accounts, are complex and not easily deterred. But Facebook said it will use machine learning and artificial intelligence to better identify and eliminate fake accounts that amplify false stories for political influence.
The 2016 presidential election, which saw multiple controversies around fake news and politically motivated hacks, prompted the cybersecurity report and provided evidence that larger disinformation campaigns were in play.
“During the 2016 U.S. presidential election season, we responded to several situations that we assessed to fit the pattern of information operations,” the report states. “We have no evidence of any Facebook accounts being compromised as part of this activity, but, nonetheless, we detected and monitored these efforts in order to protect the authentic connections that define our platform.”
Facebook didn’t say which state actors were linked with the information operations during the election, but said its “data does not contradict” the U.S. Director of National Intelligence report issued in January. The DNI named Russian President Vladimir Putin as the figure behind the influence campaign targeting the election.
Facebook’s report follows its suspension of more than 30,000 accounts earlier this month, ahead of the first round of voting in France’s presidential election, and months of scrutiny and criticism of the company’s slow response to the fake news phenomenon.
Since the U.S. election’s conclusion, Facebook has released several initiatives and tools aimed at helping users identify reputable news stories and sources, but it has called information operations “insidious” because they “obscure and impair” people’s ability to have genuine conversations.
Ultimately, the report concluded that while Facebook will do everything it can to stop information operations, it’s up to society as a whole to protect itself from falsehoods online.
“In the end, societies will only be able to resist external information operations if all citizens have the necessary media literacy to distinguish true news from misinformation,” Facebook wrote, whether that misinformation is amplified through leaked or stolen data or through fake news.