Facebook's Creepy Case Of Emotional Contagion

Facebook manipulated the moods of hundreds of thousands of unsuspecting users. Steve Brykman wants to return the favor. Pictured: As a private company, Facebook did not have to adhere to rules on the use of human subjects. (Jeff Chiu/AP)

In June, Proceedings of the National Academy of Sciences published a report on a week-long psychological experiment conducted in 2012 by Facebook and Cornell University on a random sample of 689,000 Facebook users. Engineers for the social networking website manipulated users’ news feeds, then analyzed more than three million of their posts, to determine whether the presence or absence of emotional words affected a user’s feelings and subsequent posts. Did negative posts beget more negative posts? Positive posts more positive ones? The answer is yes. The experiment also considered whether such emotional priming influenced what Facebook users chose to “like.” Translation: ca-ching.

Here’s an excerpt from the PNAS abstract:

In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

Naturally, folks got upset when they learned they’d been secretly turned into virtual lab rats. Facebook asserts that the experiment was covered by their terms and conditions, which fill an entire website of their own.

The legalese is both dense and ambiguous. Here’s a sample:

We use the information we receive about you in connection with the services and features we provide to you and other users like your friends, our partners, the advertisers that purchase ads on the site, and the developers that build the games, applications, and websites you use. For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you…to make suggestions to you and other users on Facebook…for internal operations, including troubleshooting, data analysis, testing, research and service improvement.

None of which makes what Facebook did any less creepy, particularly as it appears that they can continue to mess with us in perpetuity:

Granting us permission to use your information not only allows us to provide Facebook as it exists today, but it also allows us to provide you with innovative features and services we develop in the future that use the information we receive about you in new ways.

Earlier this month, Facebook’s chief operating officer, Sheryl Sandberg, apologized. Not for the experiment, mind you, just for not telling users about it. “This was part of ongoing research companies do to test different products,” she said. “…it was poorly communicated… And for that communication we apologize. We never meant to upset you.”

Except that’s exactly what they meant to do. Upset some of us, elate others, and track (and monetize, no doubt) the results.

Now, what if I told you that two years prior to the secret psychology study, Facebook conducted an even shadier experiment, one designed to determine whether they could alter the outcome of elections? It’s true. As it turns out, not only does Facebook have the power to make you “like” something, they also have the power to make you vote.

As reported last month in the Guardian newspaper:

On presidential election day 2010 [Facebook] offered one group in the US a graphic with a link to find nearby polling stations, along with a button that would let you announce that you'd voted, and the profile photos of six other of your "friends" who had already done so. Users shown that page were 0.39% more likely to vote than those in the "control" group, who hadn't seen the link, button or photos. The researchers reckoned they'd mobilised [sic] 60,000 voters—and that the ripple effect caused a total of 340,000 extra votes. That's significantly more than George Bush won by in 2000, where Florida hung on just 537 votes.

Why hasn’t this revelation also caused a stir? Perhaps because encouraging voter turnout is generally considered to be a good thing. But if Facebook can get you to vote, chances are they can get you to vote for a specific candidate.

This is unethical, of course. And so is the fact that Facebook devised these experiments in the first place. That they don’t appear contrite for having subjected unwitting users to these experiments suggests that a) they dig messing with the general public’s heads; b) they see little wrong in manipulating our thoughts on a mass scale; and c) they’ll do it again. (They may be doing it right now!)

So the next time you’re on Facebook, keep this in mind: They can make you think what they want. They can make you feel what they want. They can make you want what they want you to want.

As a mode for spreading propaganda, this is genius. Or evil, depending on how you look at it. With virtually unlimited user-generated content, Facebook has the ability to let you and your friends spread the word for them. Just ask Brian Boland, Facebook’s vice president of Ads Product Marketing, who wrote, “On average, there are 1,500 stories that could appear in a person’s News Feed each time they log onto Facebook. For people with lots of friends and Page likes, as many as 15,000 potential stories could appear any time they log on.”

Who doesn’t trust their friends? That’s the whole point of friendship. All that’s left for Facebook to do is pick and choose from all those posts.

I might suggest that Facebook users consider jumping ship and returning to their regularly scheduled lives. To which they might respond, “But Steve, surely you can’t expect us to stop using the social network. It’s too good!” I understand. So here’s another option, a sure-fire way to influence Facebook executives’ collective mood and register precisely what we, the previously unsuspecting, think of their secret experiments in mind control: Sell your Facebook stock, then post that you did so to your feed. Throw in a frowny emoji or two so they get the message. That’s one news item you can bet they won’t be promoting.


Steven Brykman, Cognoscenti contributor
Award-winning humorist and former National Lampoon editor Steven Brykman is currently a Mobile Strategist for Propelics. His work has appeared in Playboy, Nerve, The Huffington Post and The New Yorker, where he was featured in Talk of the Town.
