Facebook Algorithm Is Less Influential In What We Click On Than You'd Think, Says New Study — But Can We Trust That?
Most of us spend way more time than we need to on Facebook, which is why it's disturbing when we find out the site might be manipulating us. But it turns out the Facebook algorithm is way less influential in determining what users click on than we might think it is. At least, according to researchers employed by Facebook.
According to a study published in Science Express on Thursday titled “Exposure to ideologically diverse news and opinion on Facebook,” what people click on has a lot more to do with personal choice than with Facebook's algorithm. The study, which used anonymized data from over 10 million Facebook users, looked specifically at political links posted to the site. Researchers looked at what kinds of news stories users posted, noting whether they leaned liberal or conservative, and compared that to the kinds of stories those users clicked on.
And the results? Well, basically they found that regardless of how the algorithm ranked the news stories, people still didn't click on links that ran contrary to their existing political opinions. As the study explains, "on average in the context of Facebook, individual choices more than algorithms limit exposure to attitude-challenging content."
So basically, no matter how much Facebook diversifies the opinions in your News Feed, you are never going to click on the Bill O'Reilly clips your aunt posts.
In some ways, this is kind of sad. After all, social media gives people from all walks of life a chance to interact and communicate and share perspectives in ways never possible before. Plus, on Facebook, where it's much harder to be anonymous, there's less chance of someone trolling you for your opinions. And yet it seems that we aren't using the platform to engage with new viewpoints so much as to pretend those viewpoints don't exist while interacting avidly with people who agree with us. Not our finest hour, though not exactly surprising.
The findings are more encouraging when it comes to Facebook's potential to exert mind control over you one day. They suggest that Facebook's algorithm might be able to change what you see on your News Feed, but it can't make you click on things you aren't interested in — implying that we aren't as easily manipulated as some might fear.
On the other hand, it's worth pointing out that people tend to be more rigid about politics than they are about other things. The fact that Facebook can't get us to go against our political biases doesn't really rule out their ability to direct us to other stories, or otherwise influence our behavior or mental states. After all, Facebook use has been shown to affect everything from your body image to your mental health. And as we have seen in the past, the social media site is willing to not just observe our behavior but actively try to influence it.
Essentially, while it's interesting to know Facebook claims it can't influence which articles we click on, that doesn't mean we should trust whatever other studies the company may be running on us.