
Facebook's Long Troubled Relationship With Privacy

by Emma Cueto

When the news broke that Facebook conducted a secret social experiment on its users, pretty much everyone was outraged (and scared of what it might mean for the future), but I can't say any of us were particularly shocked. Facebook doesn't have the best track record on privacy, after all. And even though deliberately testing whether it's possible to manipulate people's emotional states (beyond just making them want to stay on Facebook) definitely crosses a line, this revelation didn't exactly come out of nowhere.

Unlike social media sites such as Twitter, Tumblr, or Instagram, which are designed with the idea that interaction with strangers will be the norm, Facebook started as a way to add an online dimension to relationships that exist in real life. As such, people tend to be particularly upset when information they intend to share only with family and friends reaches a much, much wider audience. When Facebook first introduced the News Feed in 2006, a feature most of us now take for granted, people were outraged. But compared to the privacy concerns that followed, that was nothing.

In 2007, for instance, Facebook launched Beacon, a service that posted users' activity on third-party websites to Facebook, essentially letting all your friends see what you were buying. There was no way to opt out of the program entirely when it first launched, and the feature was eventually discontinued after massive objections.

Facebook has also run into trouble over more insidious things, like eliminating privacy settings for timeline searches and making it impossible to truly delete a Facebook account; users can deactivate, but their data remains on Facebook's servers.

And that can be a big deal considering how much data Facebook shares with the government. Though Facebook has continued to fight plenty of government requests for user information, those challenges are rarely successful, and it's shockingly easy for the government to get ahold of a person's Facebook history.

But it is perhaps Facebook's love of data mining that raises the most eyebrows. It's well established that Facebook collects huge amounts of data on its users. The ways in which it packages that data for advertisers, though, can get pretty unsavory. Plenty of people have alleged that Facebook even uses private messages in its effort to create profiles of a person's online activity, profiles which are then sold to advertisers.

In other words, nothing you post on Facebook is ever safe.

Still, the idea that Facebook might go beyond simply monitoring its users, beyond collecting and selling information about them, beyond even trying to make users want to stay on the website, and instead actually try to manipulate people's emotions – that's pretty unsettling. Having already shown a lack of concern for its users by disregarding their privacy, Facebook is now showing a lack of concern for their emotional state – and for their right to give informed consent before participating in scientific experiments.

Facebook, to be clear, has over 1.28 billion monthly active users. More people use Facebook than belong to the Catholic Church, the largest religious denomination in the world. If Facebook is interested in manipulating the psychological states of its users – not just nudging them into spending more time on Facebook, but genuinely manipulating their emotions – that has huge potential consequences, especially when one considers how much closer Facebook and the government have grown over the years.

So are we headed into a nightmare world where, as David Holmes suggests over at Pando Daily, Facebook is engaging in psychological warfare or targeting and manipulating people from rival companies? Only time will tell, but things aren't looking good.