YouTube takes on toxic comments and trolls in new community guidelines

Google’s video platform rolled out a new set of guidelines to bury toxic comments.

CREDIT: AP Photo/Richard Vogel

YouTube’s rising and established stars now have a new tool to combat abusive and harassing comments beneath their videos: pinning positive or constructive messages to the top of the page to encourage discussion.

The new set of community guidelines, effective Thursday, allows content creators and channel producers to float selected user comments to the top, while an algorithm sifts through and preemptively eliminates comments that could be abusive.

Users will be able to opt in to the algorithm-based comment filter, which is in beta testing. Video creators and channel owners can review flagged comments and hold those they deem inappropriate or off-topic.

Like many other social media platforms and news sites, YouTube has struggled with online harassment. News organizations, including the Guardian and NPR, have abandoned comment sections in response to the volume of harassment, which often targeted writers on the basis of race or gender. Among social media companies, Twitter has made several recent efforts to curb its harassers, who are numerous enough to have potentially scared off prospective buyers. Reddit, which has had similar abuse problems, implemented anti-harassment and bullying policies last year, though many users have since characterized the move as a form of censorship.

Google’s new YouTube tools are part of the company’s larger push to moderate comments. The company has rolled out several initiatives, including YouTube Community, to help content creators better engage with their audiences and give them the ability to automatically block comments that contain preset words and phrases. The video platform also has a rewards program called Heroes, which rewards users who report offensive or abusive comments.
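The preset word-and-phrase blocking described above amounts to a creator-maintained blocklist checked against each incoming comment. A minimal sketch of that idea, assuming a simple case-insensitive substring match (this is purely illustrative, not YouTube's actual implementation; the phrase list and function names are hypothetical):

```python
# Hypothetical sketch of keyword-based comment holding.
# BLOCKED_PHRASES stands in for a creator-defined list of words and phrases.
BLOCKED_PHRASES = {"buy followers", "spam link"}

def should_hold(comment: str, blocked=BLOCKED_PHRASES) -> bool:
    """Return True if the comment contains any blocked word or phrase
    (case-insensitive substring match)."""
    text = comment.lower()
    return any(phrase in text for phrase in blocked)

# Comments flagged this way would be held for the creator to review
# rather than published automatically.
held = [c for c in ["Great video!", "Buy Followers here"] if should_hold(c)]
```

A real system would likely go beyond substring matching (handling misspellings, word boundaries, and context), which is presumably where the algorithmic filter mentioned earlier comes in.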