A new policy slated to take effect next week will prohibit Facebook users from sharing white nationalist content, as well as white separatist content, the social media company said in an announcement on Wednesday. In a statement, Facebook officials said that those "concepts are deeply linked to organized hate groups and have no place on" their website.
"Unfortunately, there will always be people who try to game our systems to spread hate," the company said. "Our challenge is to stay ahead by continuing to improve our technologies, evolve our policies and work with experts who can bolster our own efforts."
Last year, a Motherboard report revealed that while posts promoting and condoning "white supremacy" were explicitly banned from the social media website, white nationalist and white separatist content was explicitly allowed. Facebook said on Wednesday that it "didn’t originally apply the same rationale to expressions of white nationalism and separatism because we were thinking about broader concepts of nationalism and separatism – things like American pride and Basque separatism, which are an important part of people’s identity."
However, critics and civil rights groups have argued that there is no meaningful difference between the two terms, according to Motherboard.
“There is no defensible distinction that can be drawn between white supremacy, white nationalism or white separatism in society today,” president and executive director of the Lawyers’ Committee for Civil Rights Under Law, Kristen Clarke, told The Washington Post on Wednesday.
The social media network will not only ban such content moving forward, but will also implement an intervention system. Specifically, Facebook said, users who search for terms associated with white supremacy will be directed to the group Life After Hate, an organization that helps people leave hate groups. (According to its website, the group is "dedicated to inspiring individuals to a place of compassion and forgiveness, for themselves and for all people.")
Social media companies have faced increased scrutiny in recent years as white supremacists and other hate groups have used their websites to organize and express themselves, USA Today reported. Focus on these trends intensified after a fatal neo-Nazi rally took place in Charlottesville, Virginia, in 2017, according to the Post.
That scrutiny was renewed in recent weeks, after a gunman killed 50 people in two New Zealand mosques, according to NBC News. The attack was live-streamed on Facebook, per the news outlet.
The racial justice organization Color of Change, which advocates for social media companies to take stands against hateful rhetoric posted on their websites, said in a series of tweets that it was glad white supremacy and white nationalism would be treated the same on Facebook.
"We look forward to continuing our work with @Facebook as they take this critical step forward to ensure that the platform’s content moderation guidelines and trainings properly support the updated policy and are informed by civil rights and racial justice organizations," the organization said.
Facebook said in its statement that the new policy would apply to content shared on Instagram as well. "We’re making progress, but we know we have a lot more work to do," the company said.