UK moves to punish social media companies for extreme, violent content

People view floral tributes to victims of Wednesday’s attack outside the Houses of Parliament in London, Friday, March 24. Authorities identified a 52-year-old Briton as the man who mowed down pedestrians and stabbed a policeman to death outside Parliament, saying he had a long criminal record and once was investigated for extremism — but was not currently on a terrorism watch list. CREDIT: AP Photo/Tim Ireland

Social media companies such as YouTube, Facebook, and Twitter need to be punished for hosting violent and extremist content on their platforms. At least that’s what members of Parliament concluded after a yearlong investigation, according to a report from the UK’s Commons home affairs committee.

“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful,” said Yvette Cooper, the Labour MP chairing the committee, in a statement Monday.

“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve.”

The report comes a year after the shooting and stabbing death of Labour MP Jo Cox in 2016. Her killing became the impetus for the investigation after law enforcement found evidence on a computer belonging to 54-year-old Thomas Mair that he had visited white supremacist sites and researched far-right material in the days before the attack. He is now serving a life sentence.

“Social media companies currently face almost no penalties for failing to remove illegal content,” the MPs wrote in their report. “We recommend that the government consult on a system of escalating sanctions, to include meaningful fines for social media companies which fail to remove illegal content within a strict time frame.”

The committee is pushing the government to take up the question of whether it should be criminal for tech companies to leave up illegal content, including material that promotes harassment, violence, or extremist views against a particular group. If the recommendation becomes law, companies that fail to act swiftly could face fines and other penalties.

According to the report, the committee found “repeated examples” of anti-Semitic attacks, “terror recruitment videos,” and child abuse remaining online after being reported.

“It has been far too easy to find examples of illegal content from proscribed organisations — like National Action or jihadist groups — left online,” Cooper said. “They have been far too slow in dealing with complaints from their users — and it is blindingly obvious that they have a responsibility to proactively search their platforms for illegal content, particularly when it comes to terrorist organisations.”

Tech companies have been under intensifying scrutiny for how they deal with abuse, violence, and extremism on their platforms. Facebook in particular is under fire for a spate of videos in which users have filmed killings and other crimes.

In response to the MPs’ report, Google, which owns YouTube, said it has “no interest” in making money from extremist content and would expand its program that allows users to flag extremist propaganda, as well as build out its alert procedures, the Guardian reported.

For its part, Facebook says it is reviewing its moderation policies, and CEO Mark Zuckerberg recently took responsibility for a video of a Cleveland man murdering an elderly man on Easter Sunday, which stayed on the site for two hours after being reported.

“We have a responsibility to continue to get better at making sure we are not a tool for spreading,” Zuckerberg told USA Today. “Those are all against our community standards. They don’t belong there.”