With its wide reach into the lives of people all over the world, social media has the power to change and hopefully improve the way that people deal with mental health crises. On Wednesday, Facebook announced that it has integrated suicide prevention tools into Facebook Live, and made other important upgrades in the way it detects and assists users who may be having suicidal thoughts. Hopefully, these tools will make it easier for people in crisis to find the help they need, and equip other users to intervene when loved ones exhibit signs of severe mental and emotional distress.
The World Health Organization (WHO) reports that almost 800,000 people die by suicide every year; suicide is the second leading cause of death among people between the ages of 15 and 29. In a February letter to the Facebook community, Facebook co-founder and chairman Mark Zuckerberg admitted that Facebook has not always been able to prevent suicides on its platform. (Indeed, there have been multiple, very disturbing instances of people using Facebook to stream their suicides live.) Zuckerberg emphasized the important role that Facebook can play in intervention.
Facebook has had suicide prevention tools in place for a while, but the features launched today expand on those tools in crucial ways. For example, Facebook has now integrated suicide prevention tools into Facebook Live. If someone is watching a live video by a person who they suspect may be considering suicide, they can now report the video to Facebook or reach out directly to the person making the video. The person reporting the video will receive information about how to help their friend, while the person making the video will see a pop-up with options to contact a loved one, message a help line, or read tips for dealing with mental health crises.
Today, Facebook also introduced tools that allow users to chat directly with representatives from crisis support organizations like Crisis Text Line, the National Eating Disorder Association, and the National Suicide Prevention Lifeline.
Furthermore, Facebook is developing artificial intelligence to detect users who may be at risk of suicide. Although this technology is still being tested, the goal is a system that automatically flags posts expressing thoughts of suicide or self-harm, allowing the Facebook team to evaluate those posts and intervene if necessary. That way, people in crisis wouldn't have to rely on others reporting them in order to be offered help.
AI-assisted suicide prevention may sound like something out of science fiction, but if it works, it could help people at risk get the support they desperately need. To find out more about Facebook’s suicide prevention tools, visit Facebook’s Help Center.