Graphic images and video of slain U.S. photojournalist James Foley, who was kidnapped and beheaded by the Islamic State of Iraq and Greater Syria (ISIS), began circulating on news sites and social media feeds almost immediately after the video was released late Tuesday. And as the brutal video quickly spread through cyberspace, social media sites such as Twitter and YouTube started taking it down.
ISIS posted the video on YouTube and another website, AlFurqan Media. YouTube immediately removed the video. Twitter went a step further, suspending accounts and blocking users it found linking to the images or the video. Twitter’s CEO Dick Costolo tweeted early Wednesday that the site was “actively suspending accounts” related to the graphic images, linking to a New York Times post on the matter.
We have been and are actively suspending accounts as we discover them related to this graphic imagery. Thank you https://t.co/jaYQBKVbBF
— dick costolo (@dickc) August 20, 2014
Soon after, the main Twitter accounts used by ISIS, which helped circulate the images and video, were shut down.
But the images keep resurfacing. In snippets, in stills, and sometimes in full. Pro-ISIS users whose Twitter accounts were shut down popped up under new names. ISIS took the video to other outlets, such as Diaspora and LiveLeak, though both have since banned it.
Social media platforms like Twitter and Facebook have the right to filter content that violates their established policies, and they can choose how to enforce those rules. But the nature of the open Internet makes it harder to draw a line on what should be shared and what shouldn’t. And social media organizations’ growing role in deciding what gets shared, and who can share it, complicates how free speech is valued in the digital world.
Lack Of Consistency
“I think ten years ago, most news organizations would have opted to not show the video,” Jillian York, the Electronic Frontier Foundation’s Berlin-based director for international freedom of expression, told ThinkProgress.
YouTube, which is owned by Google, has exceptions and allows violent content that serves an educational or documentary purpose, said York, who focuses on free expression and privacy in the Middle East.
“All of these platforms have rules that ban violent content. Personally, I don’t have an issue with them taking the video down. But there is a problem with consistency.”
YouTube immediately took down ISIS’ video of Foley’s execution, and Twitter responded by disabling accounts of users who were sharing it. But that discretion, to remove content and punish users who promote it, isn’t evenhandedly applied.
There are videos of kids being bullied and of neighborhood fights. There’s footage of people being killed by the police, such as New York’s Eric Garner, who died in July after a police officer used a chokehold to subdue him. Viewers can clearly see Garner, still alive, gasp for breath and then slip away.
There are dozens of videos of similar executions featuring Palestinians and Syrians, some of them children, that haven’t gotten the same amount of attention as Foley’s beheading video. A simple YouTube or Google search of “beheading video” brings up a string of killings from all over the world. And the reason for that isn’t necessarily race, ethnicity or nationality, York said.
For most of the victims in Syria and in Iraq, “there’s no point of contact,” no relative or loved one to call Google or Twitter and ask that the footage be taken down, York said. So if companies are going to say “no violent content,” they need to be consistent and take all violent videos down, even if there’s no family to ask for privacy and for the images to be removed, York said.
“The companies should treat speech as the media treats speech: as an ethical question,” York said. “Whether or not Twitter should take down news accounts [for posting or linking to the video], obviously not,” she said. But in a world where everyone is the media, “whose account gets to get taken down?”
Twitter announced Wednesday that it wouldn’t suspend the New York Post’s or Daily News’ accounts over their images of Foley, contradicting Costolo’s earlier vow to ban accounts relaying the gruesome images.
“I don’t think private corporations should regulate speech,” York said. “If they are going to regulate speech — which they already do — they need to be consistent and transparent about that.”
‘All Content Is Not Created Equal’
Ken Paulson, president of the First Amendment Center and dean of Middle Tennessee State University’s College of Mass Communication in Nashville, told ThinkProgress, “The First Amendment says the government may not restrict our speech, but it has nothing to do with social media companies.” Social media networks have the right to permit or restrict content as they see fit. But how they manage free speech raises ethical questions such as what kind of content should be blocked.
Social media sites tend to take a laissez-faire approach, letting people share and say what they want, no matter how offensive or threatening. But sometimes they step in and make editorial decisions — just like a news organization — on what behavior or content is inappropriate.
The availability of and access to information today, especially how it’s shared through social media, is not only different from what it was a decade ago; it also changes the power structure of who controls what is said and who gets to say it.
“It sounds quaint right now, but there was a time when people would call newspaper offices and threaten to cancel their subscriptions [if offensive content was published]. When people paid for content, editors were more cautious,” said Paulson who was formerly editor-in-chief of USA Today. “When content is tied to clicks or traffic to a site, there’s more temptation [to publish controversial content].”
While the First Amendment gives media the right to publish, “it also gives us the right not to publish,” Paulson said.
“There have always been some publications that adhere to different standards than the rest,” Paulson pointed out. For example, the New York Post, which is known for its racy headlines and cover photos, as well as the Daily News, ran controversial cover photos depicting Foley with a knife to his neck for their Wednesday issues. The Post cover garnered massive criticism from Twitter users who deemed the choice to publish the image insensitive, irresponsible, and tasteless.
“The difference now, is that everyone has a potential worldwide audience,” Paulson said. With that, some people believe that being able to publish anything and everything is “the logical extension of free speech…But I do think that the true value of free speech stems in part from criteria of accuracy, balance and taste. All content is not created equal.”
News editors, and now social media companies, have to make judgment calls when it comes to potentially offensive content, such as a gory photo of a fatal car crash, Paulson said. “You have to weigh several factors,” like what the public expects and the source of a video or an image. In the age of digital manipulation, it’s hard to definitively authenticate anything put online — including the Foley video, Paulson said.
“There are sometimes images that can convey what words can’t,” he said, citing a photo of a fireman carrying a child out of a building after the Oklahoma City bombing in 1995. “There’s info in that photo that mere words can’t convey.”
That’s not necessarily the case with viral images of Foley being executed.
Where News Breaks First
While there’s no algorithm or filter technology seeking out violent content, Facebook tries to weed out posts that “glorify violence.” That means if someone shares the video of Foley’s beheading, Facebook won’t take it down if the person is condemning it or sharing it for its news value.
Facebook’s policy aims to promote awareness and discussion, even if the issues are violent. “Facebook has long been a place where people turn to share their experiences and raise awareness about issues important to them. Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses or acts of terrorism,” Facebook states in its Community Standards policy.
Twitter did not return ThinkProgress’ requests for comment on how the site decides which violent or graphic content, and which users who share it, should be blocked. But the site recently implemented a new image policy that allows family members or authorized persons to request that images of the deceased be taken down from the site. The new policy was developed after Zelda Williams, the late Robin Williams’ daughter, vowed to quit using the site because of the disturbing images of her father that other users sent her. Robin Williams committed suicide in August.
Social media has become a place where news breaks first, where everyday people can report events before the traditional media. And the likes of Facebook thrive on people freely sharing what’s going on in their communities — even if it’s offensive.
Facebook’s application of its policy is admittedly not perfect, a company spokesperson said. Choosing what content to allow or block is a tough balance between one person’s desire to share content he or she feels is important and another person’s wish to be shielded from graphic content. But as with other social media networks working to keep their platforms harmonious, it’s a work in progress.
For example, graphic images of the aftermath of the Boston Marathon bombing in 2013 showed victims with broken or missing limbs and protruding bones. Those images were permitted because of their newsworthiness and their ability to convey horrific events as they happened; sharing them let people on the scene report in real time. But banning some content over others can prove problematic.
“What it comes down to is whether companies want to be platforms for free speech,” York said. “Censorship is futile. In taking it down, fewer people might stumble on it by accident but they can’t get rid of it [entirely] no matter what they do.”
While “it’s a little too late to turn around and say ‘hey, we’re only a family-friendly site,’” social media companies, and consumers, have to decide what their role is when it comes to free speech. So far, social media companies have been reluctant to come out and say “this is our purpose and this is what we’re for,” York said. Facebook’s chief operating officer, Sheryl Sandberg, was proud of the site’s role in the Arab Spring, which yielded its share of graphic images.
“Today, every photo and every idea can be shared widely, even if there weren’t Constitutional protections,” Paulson said. “Even if every major media organization decided that an image was too graphic to share, there’s going to be someone in their bedroom in Connecticut ready to disseminate [it].”
As it is now, every site, news and social, is deciding what content best serves the public — even if it’s graphic at times. “There is some violent content that does serve a purpose,” York said. But the question remains, “If the media is hosting it, why can’t Twitter?”