News

Facebook Launches Suicide Prevention Tool

by Kim Lyons

At a conference on Wednesday, social networking giant Facebook announced plans to help people considering suicide, launching a new strategy to expand its platform's support tools and enlist more help from other Facebook users. The updates to its policies, which Facebook announced at its “compassion research day,” were put together with guidance from mental health organizations including Forefront, Now Matters Now, Save.org and the National Suicide Prevention Lifeline.

Facebook's safety team outlined the plans in a detailed post. In addition to actively engaging a person who may be at risk by locking their account when one of its monitors sees potentially suicidal language, Facebook says it will provide options for a friend to call or message the distressed person. A suicide support professional will also be made available.

What Facebook heard from the suicide prevention agencies it consulted was that connecting people in distress with people who care about them is crucial, according to Rob Boyle, Facebook product manager, and Nicole Staubli, Facebook community operations specialist, the authors of the post. Facebook also asks users to stay alert and to contact the company if they see posts suggesting one of their friends is in trouble.

If someone on Facebook sees a direct threat of suicide, we ask that they contact their local emergency services immediately.

We also ask them to report any troubling content to us. We have teams working around the world, 24/7, who review any report that comes in. They prioritize the most serious reports, like self-injury, and send help and resources to those in distress.

The San Francisco Chronicle explained the process like this: If Facebook's staff of monitors considers the wording in a post to have a suicidal message, that user's page will be locked and their access restricted. They won't be able to log back into Facebook until after viewing messages like the ones pictured, which contain links to suicide prevention resources. So far, the tools are only available to Facebook users in the U.S.

While this seems like a step in the right direction, privacy experts expressed concern, The Chronicle reported. One expert said Facebook is trying to "practice psychiatry," which is, obviously, not a great idea. But Facebook said 25 percent of those flagged under the new practice so far chose to contact someone, and that 30 percent of those people reached out to a suicide prevention hotline, according to The Chronicle. So it's tracking the data, to some degree.

The concept of getting a little too into users' mental states is one that has backfired on Facebook in the past. For a week in 2012, according to a study later published in the Proceedings of the National Academy of Sciences, Facebook manipulated some 700,000 users' feeds to see how they responded to an abundance of happy posts versus a slew of negative or sad posts. The company defended the practice, of course, saying it fell within the permissions for using its participants' data, but pretty much everybody was outraged.

It's not clear what information, if any, Facebook would retain from sessions where someone clicks a link to one of these suicide prevention resources. But given its track record of conducting research and mining information without users' knowledge (despite the implicit permissions), it's a question worth raising.

Overall, though, the hope is that the new strategy will help at least a few people who might be considering hurting themselves or taking their own lives. Jennifer Stuber, the faculty director at Forefront, one of the groups Facebook consulted when it put together the new strategy, told The Chronicle she was wary at first but was impressed with the effort the company has put into the program. She said:

There is tremendous potential and I’m excited Facebook is taking this on.

Images: Facebook Safety (2)