Facebook is in the news again, from the Wall Street Journal to the Guardian UK. The latest uproar erupted when researchers from Cornell University and the University of California published findings from a study they conducted in conjunction with Facebook. News coverage of the study has generally been limited to a narrow spectrum of reactions, ranging from disgust to righteous indignation. So what's all the fuss about?
Here’s the short version: The researchers were looking to establish a correlation between the frequency of positive or negative posts in a user’s news feed and the emotional tone of that user’s own posts. So for one week in 2012, Facebook tweaked the algorithm that determined the news feeds of about 680,000 randomly chosen Facebook users. Instead of filtering and ranking content solely according to user interest, the algorithm also filtered posts based on the negative or positive wording they contained.
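Facebook's actual filtering code has never been made public, but the mechanics described in the study can be sketched in a few lines. This is a minimal illustration, assuming a toy keyword-based sentiment count (the study used the much larger, proprietary LIWC word lists) and a fixed suppression rate standing in for the per-user rates the experiment assigned:

```python
import random
import re

# Hypothetical word lists for illustration only; the study relied on
# the LIWC dictionary, which is proprietary and far more extensive.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def sentiment_counts(post):
    """Count positive and negative words in a post, LIWC-style."""
    words = re.findall(r"[a-z']+", post.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos, neg

def filter_feed(posts, suppress="positive", rate=0.5):
    """Probabilistically drop posts containing words of one emotional
    valence, leaving the rest of the feed untouched."""
    kept = []
    for post in posts:
        pos, neg = sentiment_counts(post)
        emotional = pos > 0 if suppress == "positive" else neg > 0
        if emotional and random.random() < rate:
            continue  # omit this emotional post from the feed
        kept.append(post)
    return kept
```

In the experiment, one group saw a feed with positive posts suppressed, another with negative posts suppressed, each against its own control group.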
Is this any way to treat your friends?
According to the study, Facebook users who received mostly negative posts exhibited similar negativity in their own posts and tended to post less frequently. Those whose news feeds were mostly positive reflected that positivity in their own Facebook activity. The researchers drew the rather grand conclusion that they had demonstrated what they called “massive-scale emotional contagion via social networks.”
Maybe it was those words that got people wound up, because the study itself seems a bit obvious and maybe even weak. But whatever the reason, media reaction has been fast and furious. News articles refer to “human lab rats” and “thought control.” And in the UK, a senior MP has called for a parliamentary investigation.
Maybe South Park was right
The controversy centers on two issues: whether it was ethical to manipulate the emotions of Facebook users without their express permission, and whether any rules or laws were violated by failing to get that permission.
Is it too cynical to compare this to the South Park “Human Centipad” episode? Let’s face it: everybody who uses Facebook has agreed to its terms and conditions. As part of Facebook’s current terms of service, users agree to make their personal data available for “data analysis, testing [and] research.” Perhaps even more to the point, this kind of testing goes on all the time — only usually it’s conducted by marketers and SEOs.
Only two factors distinguish this social experiment from the usual A/B variant testing that we’re subjected to nearly every time we go online: first, the research was conducted as a public/private partnership; and second, and probably more important, the results were made very public.
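For readers unfamiliar with how that everyday A/B testing works: sites typically bucket each visitor into a variant deterministically, by hashing a user ID together with an experiment name. This is a generic sketch of that standard trick, not Facebook's or any particular vendor's implementation; the function and experiment names are made up for illustration:

```python
import hashlib

def assign_variant(user_id, experiment, variants=("control", "treatment")):
    """Deterministically bucket a user into an experiment variant.

    Hashing user_id together with the experiment name gives each user
    a stable assignment for this experiment, while spreading users
    roughly evenly (and independently across experiments)."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Because the assignment is a pure function of the inputs, the same user always lands in the same bucket, with no per-user state to store.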
The only thing that’s really surprising about this brouhaha is that anyone was surprised by it. Facebook was turned into a marketing machine a long time ago. As such, its algorithms are more than likely tweaked all the time without users’ permission or direct knowledge. And Facebook isn’t alone — your online presence is being tracked constantly by any number of marketers, not to mention the omnipresent Google.
Your data has already become a commodity. If you don’t like what Facebook is doing with it, get rid of Facebook. You’ve probably got better things to do with your time anyway. Read the full study, and add your comments at the bottom of the page — is this an immoral invasion of your privacy, or just business as usual?