Facebook Is #SorryNotSorry About Experimenting On You, Now And Forever More

Who knew that selfie-addicts and compulsive status-updaters could be such a valuable resource? Facebook did, and that’s exactly why it plans to continue some version of its now-infamous emotional experiments on users everywhere. Except this time, it plans to be a little stealthier about the emotional manipulation.

If you’re not familiar with Facebook’s ill-famed emotional contagion study from this past summer, here’s how it went: In 2012, company researchers altered the News Feeds of around 700,000 Facebook users to test whether seeing positive or negative updates could affect the way users expressed themselves on Facebook. In other words, they were looking to discover whether a form of “emotional contagion” could really exist in online social spaces.

When the study was published this past July, a general outcry to the tune of “lack of informed consent” erupted from Facebook users everywhere who did not appreciate being used as guinea pigs by their beloved social media site.

It took Facebook a while to get around to addressing its users’ complaints, but it eventually realized the error of its ways. Well, sort of. On Thursday, Facebook’s Chief Technology Officer Mike Schroepfer wrote a blog post on Facebook Newsroom acknowledging where this summer’s emotional contagion experiment went wrong and promising some "changes" to how future studies are conducted.

What will these changes look like, exactly? Well, Facebook might be sorry, but it doesn’t plan to stop dipping into its most valuable resource: an incredible sample set of 1.3 billion users. Rather than taking steps to assure its user base that their emotions won’t be manipulated in the future, however, Facebook has merely taken steps to “change the way they do research.”


According to Schroepfer, Facebook has instituted a new framework for internal and published research, one that focuses on clearer guidelines and enhanced review processes conducted by more highly trained specialists. So basically, it’s going to keep experimenting on its users, only better and more efficiently.

The most concerning thing about the whole Facebook-privacy-invasion ordeal isn’t that Facebook has the power to manipulate our News Feeds or even that researchers are examining our online behavior. No, the scariest part about all of this is that the only way to avoid the scrutiny of Facebook’s finest minds is to remove yourself from the glorious world of Facebook and deactivate your account. And really, who could think of doing such a thing?


The sad truth is that, as much as you might complain about lack of privacy and breach of security, you would likely rather be scrolling through emotionally directed News Feeds than be relegated to the social media abyss. And Facebook researchers know that.
