Should That Creepy Facebook Study Be Investigated?

by L. Turner

Remember that creepy emotional experiment Facebook did? A privacy group, EPIC, wants Facebook federally investigated for it. EPIC, which bizarrely stands for the Electronic Privacy Information Center, says Facebook "purposefully messed with people’s minds" in a complaint it lodged with the Federal Trade Commission on Thursday. It called Facebook's move a "deceptive trade practice" and asked the FTC to impose sanctions on the company.

EPIC also slipped one other demand in there: it wants Facebook to make its News Feed algorithm public.

EPIC rests most of its demands on the contention that at the time of the creepy experiment, in which the company showed roughly 700,000 users mostly positive or mostly negative News Feed posts to manipulate their emotions just to see what happened, Facebook's terms of service said nothing about using data for research.

The complaint alleges that when the experiment occurred, the September 2011 Data Use Policy was in effect. Then Facebook changed it, according to EPIC:

In May 2012, four months after the research at issue was conducted, Facebook made changes to its Data Use Policy. These changes included adding “internal operations, including troubleshooting, data analysis, testing, research and service improvement” to the list of things for which Facebook may use information it receives from users.

Hmm.

Facebook says that doesn't matter, because the company's actions fell under a clause that allowed it to use its data "to enhance the services we offer." Because everybody wants to be part of the world's biggest emotionally manipulative social network! Jodi Seth, a Facebook spokeswoman, said on Thursday that the exact wording didn't matter, according to The New York Times.

When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer. To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether their privacy policy uses the word ‘research’ or not.

Calling the research "corporate" is a bit of a dodge, though. We all found out about the research because it was used in an academic paper released in June. Clearly the "study" the company performed has implications far beyond Facebook's service. Still, Facebook scientist Adam Kramer, who Forbes reports helped run the study, defended it in — what else? — a Facebook post this week.

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.

Kramer's defense implicitly suggests the company was only studying the effect of negative posts to decide whether Facebook should limit those posts' "reach" to other users, so more people stay happy. That means Facebook has the power to artificially change your world to make you happier even when your friends aren't. Which is pretty Brave New World, and pretty unacceptable.

The outrage from Facebook's little experiment — and EPIC's filing with the FTC — should send the company a clear message: Don't mess around with mood manipulation. But it sounds like people at Facebook aren't quite getting that.