Has Facebook's 'emotional' study crossed the line?
Tue, 1st Jul 2014

Recent Facebook research that found evidence of “emotional contagion through social networks” has come in for severe criticism, with many claiming the company's methodology raises serious ethical questions.

Recently published results of a one-week study conducted in January 2012 found that when the number of positive posts in a Facebook user’s News Feed was reduced, the user posted fewer positive posts and more negative posts.

The social network cited this finding, among other findings in the same research, as evidence that social networks can spread “emotional contagion”, that is, the wide-scale transfer of positive or negative emotions between users.

Facebook’s researchers selected 689,000 users for the project and, using Linguistic Inquiry and Word Count (LIWC) software, anonymously analysed three million posts from these users. The posts contained 122 million words, four million of which were classed as positive and 1.8 million as negative.
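
Analysis of this kind is, in essence, dictionary-based word counting: each post is split into words, and those words are checked against fixed lists of “positive” and “negative” terms, without a human ever reading the text. The sketch below illustrates the general idea in Python; the word lists and example posts are illustrative placeholders, not Facebook's data or the actual LIWC dictionaries.

```python
# Minimal sketch of dictionary-based sentiment counting (LIWC-style).
# The word lists below are illustrative placeholders, not the real LIWC dictionaries.
import re
from collections import Counter

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "worried"}

def classify_words(post: str) -> Counter:
    """Count total, positive, and negative words in a single post."""
    words = re.findall(r"[a-z']+", post.lower())
    counts = Counter(total=len(words))
    counts["positive"] = sum(w in POSITIVE_WORDS for w in words)
    counts["negative"] = sum(w in NEGATIVE_WORDS for w in words)
    return counts

# Aggregate counts over a batch of (hypothetical) posts.
posts = [
    "So happy about the wonderful weather today",
    "Terrible day, everything makes me angry",
]
totals = sum((classify_words(p) for p in posts), Counter())
print(totals)  # e.g. Counter({'total': 13, 'positive': 2, 'negative': 2})
```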

None of the words were actually seen by researchers, and Facebook justified the experiment by saying that users’ agreement with the social network’s terms and conditions when creating their account constituted “informed consent on this research”.

"But, with this research, Facebook has skated too close to the edge of online users’ willingness to share data," claims Pamela Clark-Dickson, Senior Analyst, Consumer Services, Ovum.

"Users do accept that any data that they share online is likely to be analyzed and studied by the company with which it is shared, typically for the purposes of the company providing a better service or product.

"What users don’t typically expect is that their “informed consent” would include their willingness to participate in what was essentially a sociological study that comprised the manipulation of their emotional health.

"The way in which the study was conducted could have had tragic consequences for those unwitting participants who may already have been suffering poor mental health, but who were randomly selected to receive a higher proportion than usual of negative posts."

In the UK and in the US, for example, about one in four people will suffer from a mental illness in any given 12-month period.

Given the nature of this particular research, Clark-Dickson believes Facebook should have sought explicit consent from participants prior to embarking upon it.

Adam Kramer, one of the Facebook researchers who conducted the study, has at least had the good sense to issue a prompt apology now that the study has been made public, stating that the company has since improved its “internal review practices.”

In his own words: “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

But according to Clark-Dickson, the irony lies in the way Facebook tried to address its concern that users might avoid visiting Facebook because of exposure to friends’ negativity.

"The likelihood is that a proportion of Facebook users may now well avoid visiting the site because of their own concerns about whether they will be unknowingly involved in another potentially harmful social experiment," Clark-Dickson adds.

"This is an outcome that could have been avoided had Facebook been more transparent at the outset.”