Facebook Beheading Policy: Zuckerberg Changes His Mind, Again
C'mon, Facebook, get a grip. After Monday's controversial announcement that Mark Zuckerberg and co. were refusing to take down a viral decapitation video, apparently because social-media activism and free speech prevailed, a storm of criticism was unleashed against the site. But Facebook stood its ground — oh wait, JK. Here's what happened: first, Facebook stuck a "Warning: Violent" banner on the video, but late Tuesday, it pulled the video entirely from the site. (This was after Facebook initially took the video down in May, and then put it back onto the site, citing social justice.) The video shows a masked man beheading a woman, reportedly in Mexico, and prompted a flurry of complaints from Facebook users who wanted it taken down.
Facebook's original statement defending the video, Take One:
Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events. People are sharing this video on Facebook to condemn it. If the video were being celebrated, or the actions in it encouraged, our approach would be different.
In other words, as Bustle reported yesterday, the crux of the issue has been how content is being shared, not what the content is. Previously, Facebook relaxed its ban on breastfeeding photos because the photographs weren't being shared for explicit purposes; the company also changed its policy to ban pro-rape memes because those were being shared, um, indelicately. But when it came to viral decapitation as a form of social activism, British Prime Minister David Cameron was pissed.
Facebook then slapped a "Warning: Graphic Content" banner onto the video, and the issue seemed dealt with. That is, until Facebook made a U-turn yesterday, taking down the video and releasing a very long statement explaining exactly why.
People turn to Facebook to share their experiences and to raise awareness about issues important to them. Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence. When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it.
As part of our effort to combat the glorification of violence on Facebook, we are strengthening the enforcement of our policies.
First, when we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence.
Second, we will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience.
Based on these enhanced standards, we have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it.
Going forward, we ask that people who share graphic content for the purpose of condemning it do so in a responsible manner, carefully selecting their audience and warning them about the nature of the content so they can make an informed choice about it.
From "social activism" to "irresponsibly glorifying violence"? Looks like Facebook's as unsure where to draw the line as the rest of us.