Facebook Just Changed Its Search Function To Avoid Triggering Users

People are increasingly using social media to find community support for their mental health struggles. As a result, platforms like Facebook are trying to strike a balance between promoting community formation and preventing people from being inadvertently harmed by intense mental health content on social media. To do this, Facebook is changing its search policies regarding self-injury and suicide.

In an effort to protect "those who are most vulnerable" to mental health struggles, Facebook's Newsroom statement says that the company will "no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm, even when someone is seeking support or expressing themselves to aid their recovery." On Instagram, too, the company will make it more difficult to search for content regarding self-injury and will prevent it from being recommended to users. Sensitivity screens will also be placed over photos featuring self-harm cuts, even when they have healed, "to help avoid unintentionally promoting self-harm."

And, for any users who post content "promoting eating disorders or self-harm," Facebook says it will continue to send resources, even if the user's post has been flagged and removed. To support these efforts, Facebook is also looking to hire a health and well-being expert who can enhance the platform's ability to support user mental health.

But Facebook doesn't simply want to take down content people might be posting or searching for in order to seek help, affirmation, or community. According to Facebook's Newsroom statement, the platform includes Orygen's #chatsafe guidelines in the Facebook Safety Center to educate users about how to most effectively and safely communicate about suicide-related content online.

Orygen, The National Centre of Excellence in Youth Mental Health, created their #chatsafe guidelines in collaboration with young people who find it important and validating to talk openly about experiences with suicidal thoughts online. By encouraging young people to think about why they want to post or reply to content about suicide online, #chatsafe guidelines discourage using stigmatized words and invite young people to consider flagging their own posts with content warnings when applicable.

And according to its Newsroom statement, Facebook has been working hard behind the scenes to promote the kinds of conversations that can help, rather than inadvertently harm, people's recovery processes. For example, the platform removed or added sensitivity screens to over 1.5 million posts with self-injury and suicide-related content between April and June of 2019 alone.

The fact that there have been so many Facebook posts regarding self-injury and suicide in such a brief period of time is a strong indicator that social media is a crucial outlet for mental health conversations. According to a 2016 study published in the journal Epidemiology and Psychiatric Sciences, sharing details of personal mental health journeys on social media with online peers both challenges mental health stigma and increases people's sense of social belonging and community. And a 2018 study published in the journal Biomedical Informatics Insights concluded that social media posts about mental health can enhance people's empathetic responses to other people's experiences with mental health and violence IRL.

But there is a flip side to people's online exposure to others' experiences with self-injury and suicide. According to a 2018 study published in the Indian Journal of Psychiatry, while young people who self-harm are more likely than those who don't to seek support from others online, this also increases their exposure to "negative messages promoting self-harm, emulating self-injurious behavior of others, and adopting self-harm practices from shared videos."

Looking to the future, according to its Newsroom statement, Facebook wants to "share public data from our platform on how people talk about suicide, beginning with providing academic researchers with access to the social media monitoring tool, CrowdTangle," with the aim of using posts to recognize warning signs that someone might need help. Because you deserve the community support you need, along with the content warnings and sensitivity screens that might make those communities most effective for you.