Facebook Rolls Out Restrictions For Graphic, Violent Video Content

CREDIT: AP PHOTO/PAUL SAKUMA

Facebook is blocking certain videos from playing automatically on the site, instead displaying warnings that the videos may be graphic and offensive, the BBC reported.

Normally, shared videos begin playing automatically as users scroll through their Facebook feed. Under Facebook's new policy, the video's image is blacked out behind an overlay warning viewers that the video contains graphic content, which "can shock, offend, and upset," and asking, "Are you sure you want to see this?"

To see the video, users have to show that they’re at least 18 years old, a barrier underage users could circumvent through fake accounts or other methods.

The move comes after criticism of Facebook and other social media sites for freely allowing potentially disturbing content, such as beheading videos or pornography, to appear in users' feeds, including those of minors, without warning.

In August, footage of American photojournalist James Foley's beheading at the hands of the Islamic State of Iraq and Greater Syria (ISIS) terrorist group sparked debate over how social media should handle unsettling images or video related to news events. More recently, a video of the shooting of police officer Ahmed Merabet by terrorists in Paris has begun circulating, and it will fall under the new Facebook policy.

Facebook's policy has always prohibited and sought to remove content that "glorifies violence" or contains explicit nudity. Offensive videos are usually flagged by users or identified by overseas contract workers and then taken down. The site and others like it have struggled to keep up with and curate the millions of videos and images uploaded every day, while ensuring Facebook users can create, see, and share news in their community.

Facebook has maintained that its platform should promote discussion and awareness, even when the topics are unsavory or violent. But social media sites have historically taken an uneven approach to moderating content, trying to balance people's right to know and share what happens in their world against the need to filter out visually disturbing material.

Facebook has also had trouble determining what content to regulate. For instance, the social network came under fire for refusing to take down images of rape and domestic violence that were labeled as “humor,” even as it disabled accounts that posted photos of breastfeeding.

Videos are the top content driver on Facebook, with video uploads up 94 percent in the U.S. and 75 percent worldwide over the last year. As a result, Facebook is looking to cash in on its 1 billion daily video views, most recently by acquiring video tech startup QuickFire.

The moral and ethical debate over how social media companies should handle graphic content is becoming more mainstream. The Supreme Court is now hearing the case of Anthony Elonis, a user who repeatedly posted explicitly violent comments directed at his wife in the form of lyrics or poetry on Facebook. Elonis' account wasn't suspended, but he was convicted in 2010 for the threats he made online.