*Sad Trombone Sound*

Facebook experiments, you’re the lab rat


Oh boy. Thanks to the watchful eye of New Scientist, we can now read the results of that experiment Facebook asked us all to participate in. Wait, you didn’t give your permission to have your News Feed manipulated for a science experiment? As many can attest, you certainly did. By signing up for Facebook and agreeing to the Terms of Service, you accepted a Data Use Policy that leaves Facebook perfectly within its bounds to manipulate News Feed data.

So we have that going for us. But what did Facebook actually do, and why? Essentially, Facebook wanted to test an extension of the theory that we are affected by the emotions of those around us. For this experiment, Facebook researchers set out to determine whether we are similarly affected by emotions expressed on the internet, namely positive or negative News Feed posts.

Facebook tested this hypothesis by curating 689,000 users’ News Feeds for a week using an algorithm designed to analyze words with positive and negative connotations. Some users had the positive posts of their friends suppressed, while others had the negative posts suppressed.
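The filtering described above boils down to automated word counting: score each post against lists of positive and negative words, then drop posts of the targeted polarity. Here's a minimal sketch of that general technique; the word lists and function names are hypothetical stand-ins, not Facebook's actual dictionaries.

```python
# Sketch of word-list sentiment classification, the general technique
# behind the study's feed filtering. Word lists here are illustrative
# stand-ins, not the dictionaries Facebook actually used.

POSITIVE = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE = {"sad", "angry", "hate", "awful", "terrible"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word counts."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def suppress(feed: list[str], polarity: str) -> list[str]:
    """Curate a feed by dropping posts of the targeted polarity."""
    return [post for post in feed if classify_post(post) != polarity]
```

A curated feed for the "negative posts suppressed" group would then be `suppress(feed, "negative")`, leaving only neutral and positive posts visible.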

Thankfully, the study produced some results, as Facebook confirmed that users who had negative posts suppressed for the week were more likely to post positively by the end of it, and vice versa. Who knew?

Okay, this isn’t groundbreaking research, but there are obviously implications here. First off, while many are aware of the possible loss of privacy that comes with using a Facebook account, actually reading about the manipulation of personal News Feeds is slightly disturbing. Does anyone else feel a little violated? As far as the research goes, it’s certainly not surprising, but as Animal New York points out, it could be a helpful tool in demonstrating how even words posted on the internet can affect someone.

If you’d like to read the full report, follow the link. How does everyone else feel about being subjected to Facebook experiments?

Tags: Facebook, Featured, Popular, Privacy, Science

  • Jacob Long

    As an academic, I’m all for it. I wish they’d open up more to outside researchers. While feeling “manipulated” is one thing, it’s important that folks take note that no humans were reading their news feeds. An algorithm was created for the purpose of keeping human noses out of private/semi-private communication.

    While this is academic in nature, businesses do experiments all the time to see how customers respond. Trying two different ads/page layouts/phrasing of content and comparing user response is a pretty common internal research strategy.
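The internal experiments described here are usually run as simple A/B comparisons: randomly assign users to a variant, then test whether response rates differ. A minimal sketch under made-up numbers (all counts and thresholds below are illustrative, not from any real experiment):

```python
import random
from math import sqrt

# Toy A/B test: deterministically assign users to one of two variants,
# then compare click rates with a two-proportion z-test.
# All numbers fed into these functions are illustrative.

def assign_variant(user_id: int) -> str:
    """Stable per-user assignment: the same user always sees the same variant."""
    rng = random.Random(user_id)
    return rng.choice(["A", "B"])

def conversion_rate(clicks: int, views: int) -> float:
    return clicks / views

def z_score(clicks_a: int, views_a: int, clicks_b: int, views_b: int) -> float:
    """Two-proportion z-test: how surprising is the difference in rates?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p = (clicks_a + clicks_b) / (views_a + views_b)   # pooled rate
    se = sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se
```

A |z| above roughly 1.96 is the conventional cutoff for calling the difference significant at the 5% level, which is typically when a team would ship the winning variant.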

The key with all things like this is that the company is technically competent enough not to accidentally spill user data to the wrong people, but that’s an inherent risk everywhere we go on the ’net.

    • ryan

As a “former” academic, I also support this. I’ve done human-subject research since 2007, and not disclosing that an experiment is being done (or the actual goal of the experiment) has yielded very useful results. Withholding the true objectives or purpose of an experiment has historically been controversial, but in most scenarios, in academia and business alike, research is regulated by a review board prior to the start of testing.

      To be fair, we’re all lab rats all the time, whether online or in real life. Industrial psychology has sort of made it trendy to constantly observe trends :)

    • pr

      I kind of feel like this is inherently different because they’re manipulating their own users’ individual content.

A news organization selectively omitting certain content is a whole different animal compared to omitting user content without their knowledge. It also raises the question: what are the boundaries here? Do we know? Does Facebook?

      The scientist in charge of this said he himself had second thoughts about the study.

      • Jacob Long

        One thing I did not realize initially about the study is that they found an incredibly minute effect. They address this finding in the paper, saying in part that it is because they didn’t want to over-manipulate the subjects.

You know, they also manipulate the News Feed for a lot of other reasons. As a person who runs a Facebook Page, I’ve learned the hard way what some of those purposes are: to keep you out of touch with your followers if you aren’t willing to buy ad placement.

        • pr

          Interesting, I think this study is fascinating in that it has provoked such a strong response from both sides. So despite my feelings, it’s clearly not black-and-white. Methinks a more extensive post concerning privacy is in order.

  • Jacob Long

Out of curiosity – do you think it’s worse when they do this, or when they tinker with the News Feed for other reasons, like deciding what they think is and is not relevant to your interests? They constantly tweak the algorithms for a variety of reasons, and I am sincerely curious as to which reasons people dislike the most.

    • Leave Comments

      I dislike any of it. They should not be tinkering with anyone’s personal feeds.