Big Brother Is Watching

Hey Facebook, Good Job Alienating Everyone to Make a Point We Already Knew

Now that we've been manipulated for science, let’s take a look at what we've learned.
Facebook leveraged their database for scientific good. But was it worth the trouble? (Photo via Pazca)

Facebook gave the world a new reason to think they’re a bunch of scary, omnipotent puppeteers last week when it was revealed that Facebook data scientists tinkered with users’ news feeds to study the emotional impact the changes would have.

It all started when the Proceedings of the National Academy of Sciences published the study by a team of data scientists working with Facebook, which went mostly unnoticed at first. Once it made its media debut — likely in this small article from NewScientist — it caught fire, and the headlines are calling the study creepy, manipulative and unethical.

The anger is understandable — no one wants to be a lab rat in a maze they didn’t even know they were running. But looking beyond the outrage, what is it that they were actually studying, and what measurable effect did the experiment have on users? As it turns out, the researchers went out of their way to barely prove something we already assumed was true.

‘Emotional Contagion’

The study takes a look at “emotional contagion,” which sounds like a form of spiritual wasting disease, but is really just a theory of how emotions transfer within groups of people.

The study claims:

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others.”

Emotional contagion is indirect. It’s not whether or not someone makes you feel good or bad directly, but whether emotions can rub off on others through public expressions — less like a direct injury and more like the spread of a disease (hence: “contagion”).

So in order to test if emotions spread on Facebook, the data scientists altered the news feeds of almost 700,000 users, sometimes making positive posts show up less frequently, sometimes doing the same with negative posts. Then they looked at whether the users with manipulated feeds started expressing themselves more positively or negatively.

But how did they process millions of posts to figure out what counts as positive or negative? As it turns out, this is where things get a little hinky.

Shoddy Methods

To figure out if the posts were “positive” or “negative,” the research team used Linguistic Inquiry and Word Count (LIWC 2007), a tool which counts the number of positive and negative words in a status and gives them scores to determine if the post is overall happy or sad.
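To get a feel for how word-count scoring works, here’s a minimal sketch in Python. The lexicons below are toy examples invented for illustration — the real LIWC 2007 dictionaries contain thousands of categorized words and aren’t reproduced here — but the counting logic is the same in spirit: tally positive and negative words, then compare the totals.

```python
import re

# Toy lexicons for illustration only -- the actual LIWC 2007
# dictionaries are far larger and more nuanced.
POSITIVE = {"happy", "great", "love", "good", "awesome"}
NEGATIVE = {"sad", "terrible", "hate", "bad", "awful"}

def score_post(text):
    """Return (positive_count, negative_count) for a status update."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(1 for w in words if w in POSITIVE)
    neg = sum(1 for w in words if w in NEGATIVE)
    return pos, neg

def classify(text):
    """Label a post by whichever word count wins."""
    pos, neg = score_post(text)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

So `classify("I love this, what a great day")` comes back `"positive"` — and you can already see the weakness the critics point to: a five-word post offers the counter almost nothing to count.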

The problem is that LIWC 2007 isn’t a very good tool for evaluating status updates. As the founder of PsychCentral points out, LIWC 2007 is meant to take on longer texts, like books, articles and transcripts. But Facebook status updates — at least the ones most of us can tolerate reading — are generally much shorter than even a paragraph.

In fact, a 2013 study by social media analytics company Quintly looked at over 13 million Facebook posts, and found that the most common post lengths are shorter than a tweet.

(Graphic via Quintly)

At this length of post, LIWC 2007 doesn’t have enough room to draw good conclusions. Fewer words mean fewer opportunities for measurement. Using this tool on short posts is like trying to figure out if one basketball team is better than another after only a minute of play — there’s just not enough to go on.

Combine that with how subtle informal expression can be, and you lose out on a scientifically accurate reading of each post.

“Since the LIWC 2007 ignores these subtle realities of informal human communication,” PsychCentral writes, “so do the researchers.”

Questionable Impact

So after all the controversial manipulation of news feeds and processing status updates with the wrong kind of tool, what did the researchers discover? Adam Kramer, an author of the study who tried to do damage control with an apologetic Facebook post, wrote:

“And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.”

Put another way: if your status updates over a week add up to the equivalent of a three-page college essay, you might write a single word differently based on how everyone else is feeling.
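A quick back-of-the-envelope check bears that out. Assuming roughly 250 words per typed page — a common estimate, not a figure from the study — the reported effect works out to less than one word changed across an entire essay’s worth of posting:

```python
# Reported effect size: about one emotional word changed
# per thousand words written over the following week.
effect_per_word = 1 / 1000

# Assumes ~250 words per typed page (an estimate, not from the study).
words_per_page = 250
essay_words = 3 * words_per_page   # a three-page essay: ~750 words

words_changed = essay_words * effect_per_word
print(words_changed)  # 0.75 -- less than a single word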

In the paper, the authors argue that if that tiny, barely measurable effect is real, it would come out to hundreds of thousands of “expressions” a day when you consider all Facebook posts taken together. That’s hardly anything, considering Facebook has 1.28 billion active users.

Even if the effects of altering news feeds were real, the results of the study boil down to something we already know to be true: that if people read positive or negative things, it will affect them emotionally, and they might also feel good or bad, and express it. This should come as no surprise for anyone who has hate-watched a viral video proposal.

Mr. Kramer may have put it best in his own Facebook post when he said:

“In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

We hope that if Facebook decides to dig into its historically unprecedented database of human behavior again, they’ll discover something more worthy of the trouble.

Follow Jack Smith IV on Twitter or via RSS. jsmith@observer.com