A California man was arrested and placed in a psychiatric hospital after posting a fake suicide threat on Facebook, the BBC reported.
The social network rolled out a suicide prevention program earlier this month that lets users report posts that threaten suicide. But 48-year-old Shane Tusch of San Mateo, California, set out to test whether the new policy actually worked, threatening to hang himself from the Golden Gate Bridge.
Another user saw the post and reported it to local police. Tusch was then arrested and held in a psychiatric hospital on suicide watch for 72 hours. Facebook also locked his account.
Facebook has been working to improve its suicide prevention efforts for several years. In 2011, the social network introduced a feature that linked suicidal users with counselors. The latest program goes a step further: users who spot friends or loved ones in trouble can connect them directly with another friend or a suicide hotline, or point the potentially suicidal friend to other prevention materials, such as relaxation techniques and video messages.
Facebook also reviews flagged posts to see whether they call for more serious intervention, in which case it contacts the person expressing suicidal thoughts directly and offers help. For imminent threats, Facebook will also call local police to check in on the person.
“For those who may need help, we have significantly expanded the support and resources that are available to them the next time they log on to Facebook after we review a report of something they’ve posted,” Rob Boyle, Facebook’s product manager, and Nicole Staubli, Facebook’s community operations safety specialist, wrote in a statement.
Tusch, a married father of two, said his experience in the hospital was inhumane. He was denied “any humane care” and had to undergo medical tests, all, he said, because Facebook overstepped its role.
“Facebook needs to leave suicide prevention to family and friends,” he said via Facebook, noting that the person who called the police was essentially a stranger. “There are no checks and balances! I was only proving a point that Facebook should not be involved in this.”
The consumer advocacy group Consumer Watchdog has petitioned Facebook to suspend the new program.
Mental health advocates praised Facebook’s new program as another avenue for people experiencing depressive or suicidal thoughts. Suicide is the tenth leading cause of death in the U.S.: millions of Americans attempt suicide every year, and some 41,000 take their own lives. Among young people aged 10 to 24, it is the second leading cause of death in the U.S.
In recent years, social media has been linked to suicides because of a rise in cyberbullying. But researchers have found social media could also be a saving grace, since it serves as a prime outlet for people to emotionally unload and to publicly describe themselves as insignificant or burdensome. Facebook’s program isn’t a panacea or a replacement for clinical help, mental health experts say, but social media can be a way for people to provide emotional support and show empathy to those who are struggling.
Facebook has been moving toward stronger policies on the content users post and share on the platform. The social network rolled out new guidelines earlier this week, alongside its transparency report, that outline what content is allowed on the site and what might get blocked. No changes were made to the site’s policies, but the guidelines better illustrate the types of posts and images that may cause Facebook to intervene.