Will Facebook’s Efforts Actually Help Prevent Suicide?

CREDIT: AP PHOTO/PAUL SAKUMA

Preventing suicide is a difficult undertaking because it’s an action that’s carried out swiftly and desperately by those struggling to deal with their mental anguish alone, even if they may lead a seemingly normal life. However, not all cries for help are silent — especially on social media, where you may come across melancholic statuses from friends on your newsfeed.

Now, Facebook wants to capitalize on the confessional nature of its platform. The social media giant is rolling out a new suicide prevention tool — which it created in partnership with a few mental health organizations — that allows users to reach out to their troubled loved ones virtually and connect them with online resources after spotting the first sign of trouble.

While mental health experts believe the app could help concerned family and friends spark much-needed conversations and connect distressed people with resources, some warn against depending too heavily on the social media platform for such sensitive matters when direct contact may prove more effective.

With the suicide prevention tool, users who are concerned about a friend’s post can “report” it and choose to contact that friend directly, reach out to another friend for support, or call a suicide prevention hotline. Facebook then examines the reported post to see if it warrants intervention. If so, the friend in question will receive a message that gives him or her the option of reaching out to a friend, calling a suicide hotline, or looking over a host of suicide prevention materials, including video messages and relaxation techniques.

“For those who may need help, we have significantly expanded the support and resources that are available to them the next time they log on to Facebook after we review a report of something they’ve posted,” Rob Boyle, Facebook’s product manager, and Nicole Staubli, Facebook’s community operations safety specialist, wrote in a statement posted on the social network last week during its Compassion Research Day, an annual event that’s part of an effort to help users flag potentially harmful material they see online.

As of last week, half of Facebook’s users worldwide had received the suicide prevention app, and everyone with a Facebook account will have access within a matter of months. Experts in the field are optimistic about the new tool’s potential.

“I’m really excited about this great news,” Lauren Redding, communications coordinator at Active Minds, a nonprofit organization that raises awareness about the issue among college students, told ThinkProgress. At Active Minds, Redding helps spread information about self-help resources to college students via social media. She said that Facebook’s suicide prevention app represents the successful integration of technology and mental health.

“This app is reaching people where they are,” Redding said. “We didn’t get this kind of mental health education 40 years ago. Members of the general public are getting information delivered to them on the social media platform. Facebook is our biggest tool in getting the message out about suicide prevention. Many mental health organizations and nonprofits are harnessing Facebook’s power to get information out to the masses.”

More than 41,000 Americans commit suicide annually and millions more attempt to do so, according to the Centers for Disease Control and Prevention, making suicide the tenth leading cause of death domestically. Among young people between the ages of 10 and 24, suicide is the second leading cause of death in the country, and it is the third leading cause among college-aged students.

In recent years, social media has largely been cast as a culprit in the suicide epidemic among young people, particularly because of its potential to facilitate cyberbullying. However, people in the mental health and technology fields have increasingly realized that the platform could also serve as a tool for preventing suicides.

Researchers have found that while people who commit suicide aren’t likely to overtly outline their specific plans via social media, they may take to the online platform to describe themselves as a burden on the world and publicly ponder how they should “correct their mistakes.”

Lisa Horowitz, staff scientist and pediatric psychologist at the National Institute of Mental Health, praised Facebook’s latest strategy to meet troubled youngsters on a medium they use often, saying that the suicide prevention app enables friends and family members to step in immediately.

“We need to be thinking innovatively with our techniques and reach out to a generation of people who are engaged on social media,” Horowitz told ThinkProgress. “People who are feeling suicidal also feel alienated, and if they’re reaching out on social media, then it can be a bridge that helps them obtain really good prevention resources. Sometimes people don’t know what to say to their loved one, and this tool gives them some language. It’s really hard to reach out and tell someone that you’ve noticed something about them that would require mental health attention.”

Mental health experts say that reaching out to a potentially suicidal person and connecting them with resources often requires family members to be empathetic, even if that means moving at a pace slower than what they had in mind. Help Guide, an online repository of information about mental health issues, warns concerned friends and family against arguing with a troubled person or chastising them for wrangling with the thought of suicide. The ideal interaction requires that one show no judgment, exhibit patience, reassure the person that everything will be okay, and directly ask the person in question whether they are having suicidal thoughts.

Dr. Dan Reidenberg, executive director of Suicide Awareness Voices of Education, a suicide awareness organization, told ThinkProgress that while social media could never replace face-to-face contact, Facebook’s suicide prevention tool incorporates some of those methods by asking potentially suicidal people questions and suggesting resources rather than prescribing a course of action. That aspect of the app, he said, places the onus on the troubled Facebook user while putting their friends and family’s minds at ease.

“We can’t overreact when someone is expressing thoughts of suicide and self-harm,” Reidenberg said. “In some ways, this is that middle point where someone might be concerned and the user will be in the best position to determine what kind of help they need. We continue to do all we can for them when they are ready for it… This tool helps because it has reminders in it. It helps troubled people do things on their own. When that’s not enough, it gives them the tools to know where to go next.”

Facebook’s suicide prevention app follows the company’s 2011 attempt to connect users to suicide prevention resources. Other efforts have followed suit, including the Durkheim Project, which sought to analyze the speech patterns of potentially suicidal people. Google also stepped into the suicide prevention sphere in 2010, when the search engine launched a feature that directed queries about suicide to contact information for the National Suicide Prevention Lifeline.

But privacy, or a lack thereof, remains a concern among Facebook users. Only five percent of Americans trust the social network to protect their personal information, according to a 2014 poll conducted by Princeton Survey Research Associates International. In the same poll, more than a quarter of respondents said they think Facebook would violate their privacy. Users may be turned off by the new tool if their routine posts are misidentified as potential cries for help.

In an email to ThinkProgress, Ursula Whiteside, a researcher who helped launch Facebook’s suicide prevention app, acknowledged that some messages could be misinterpreted as warnings of a suicide attempt, but noted that Facebook will take steps to mitigate that: its staff will be trained by the National Suicide Prevention Lifeline and will carefully review the reported material.

Even if some posts do get misreported, Pamela Rutledge, the director of the Media Psychology Research Center at Fielding Graduate University in Santa Barbara, CA, said that the Facebook suicide prevention app is a step in the right direction. It stresses the need for people to stay vigilant for signs of trouble in their loved ones, especially if they haven’t spoken to them in some time or live in a different part of the country.

“What we’re seeing here is a shift toward making society more responsible for a citizen’s wellbeing. People are looking at these social networks in a way that when something bad happens, they’re looking at those who probably should have stepped in,” Rutledge said. “If you’re talking about something that could be harmful, then there’s a professional obligation on the part of Facebook to create a system where they can take steps. It’s just like in school; if a teacher has a reason to believe that a child is being abused, then they have a duty to report.”

But Horowitz pointed out that simply reporting the message to Facebook may not suffice. While she called Facebook’s suicide prevention app an effective tool for helping families overcome their fears about asking a loved one about their troubles, Horowitz said that old-fashioned face-to-face communication can still help all parties involved, especially the potentially suicidal person.

“If you’re worried that someone’s at risk for suicide, you should ask them directly,” Horowitz said. “People are afraid of asking someone that question because they’re worried about putting ideas in their head but that’s a myth. What you could do instead is ask someone about their plans. If they said they were going to kill themselves, that’s the time to ask the question.”