Oh boy. Thanks to the watchful eye of New Scientist, we can now read the results of that experiment Facebook asked us all to participate in. Wait, you didn’t give your permission to have your News Feed manipulated for a science experiment? As many can attest, you did. By signing up for Facebook and agreeing to the Terms of Service, you accepted a Data Use Policy that puts manipulation of News Feed data perfectly within bounds.
So we have that going for us. But what did Facebook actually do, and why? Essentially, Facebook wanted to test an extension of the theory that we are affected by the emotions of those around us. For this experiment, Facebook researchers set out to determine whether we are similarly affected by emotions on the internet, namely positive or negative News Feed posts.
Facebook tested this hypothesis by curating 689,000 users’ News Feeds for a week using an algorithm designed to analyze words with positive and negative connotations. Some users had the positive posts of their friends suppressed, while others had the negative posts suppressed.
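For the curious, the word-connotation approach can be sketched in a few lines. This is a simplified illustration, not Facebook's actual system (the study relied on the LIWC text-analysis tool), and the tiny word lists here are hypothetical placeholders:

```python
# Minimal sketch of word-list sentiment scoring. The actual study used
# the LIWC tool; these tiny word lists are illustrative placeholders only.
POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "angry", "hate", "awful", "terrible"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting
    words with positive vs. negative connotations."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("What a wonderful, fun day!"))   # positive
print(classify_post("I hate this awful weather."))   # negative
```

A suppression algorithm would then simply filter posts carrying one of these labels out of a user's feed for the duration of the experiment.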
Thankfully, the study produced some results, as Facebook confirmed that users who had negative posts suppressed for the week were more likely to post positively by the end of it, and vice versa. Who knew?
Okay, this isn’t groundbreaking research, but there are obvious implications here. First off, while many are aware of the possible loss of privacy that comes with a Facebook account, actually reading about the manipulation of personal News Feeds is slightly disturbing. Does anyone else feel a little violated? As far as the research goes, it’s certainly not surprising, but as Animal New York points out, it could be a helpful tool for demonstrating that even words posted on the internet can affect someone.
If you’d like to read the full report, follow the link. How does everyone else feel about being subjected to Facebook experiments?