It’s emerged over the weekend that social network Facebook, with the help of two US universities, conducted an experiment into its users’ emotions and the effect of posts on them. The study, conducted two years ago, ran for a week, during which the site deliberately skewed the appearance of users’ newsfeeds to gauge the effect of positive and negative posts.
Some people were shown more of their friends’ posts containing positive statuses, whilst others were shown more that were negative. The research found a direct correlation between what appears in the newsfeed and how happy or sad it makes us feel: users who saw more happy posts tended to post positive updates themselves, whilst the opposite happened with ‘sad’ posts.
The news has, of course, prompted fresh outrage, with many people taking to the web to encourage others to delete their Facebook accounts. It seems that not everyone is happy about being manipulated without their knowledge, even if it is in the name of science.
The study affected around 700,000 people in January 2012 and the results were published in the prestigious journal Proceedings of the National Academy of Sciences. Whilst the research was “almost certainly legal”, this is the first time that users are known to have actually been manipulated during such research on the site. Facebook users agree to their data being used for research when they sign up, as it’s listed in the terms of service.
However, unsurprisingly, privacy advocates are questioning whether or not the move is ethical as it had a direct effect on the emotions of the site’s users.
A Facebook spokesman told The Atlantic that “none of the data used was associated with a specific person’s Facebook account.” He went on to say that the company carries out such research as a means of better understanding its users and what they want.
“A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely,” he added.
The study found that emotional states can be directly transferred to others by “emotional contagion”, leading people to alter their own behaviour in reaction to the posts of others. According to Susan Fiske, one of the Princeton University psychology professors who edited the publication, the study was approved by an institutional review board that vets research involving people to make sure that it’s ethical.
However, there seem to be some questions surrounding what was actually approved – the use of the data or the experiment itself.
This year has seen several calls for people to delete their Facebook accounts over the site’s privacy policies, but the network remains as popular as ever. It’s worth considering that the media attention the experiment is getting owes much to the rather predictable response it was always going to provoke from around the web.
It’s well known that the media can be used to manipulate the masses, and frequently is. With this in mind, is the outcry surrounding the Facebook experiment nothing more than a knee-jerk reaction, with the media jumping on the ‘sell more stories’ bandwagon in an equally predictable manner?
Whilst it’s unlikely that many users will have read the small print covering how their data is used for research, it is there, and it is something that users agree to when they sign up. However, the ethical questions centre on the fact that Facebook attempted, and succeeded, in manipulating users and changing the actions they might otherwise have taken.
No doubt there will be an investigation into the ethics surrounding the affair, and Facebook will of course lose a small proportion of its users. However, one of the paradoxes surrounding the site is that everyone seems to love to complain about privacy, but few people actually do anything about it.
It will be interesting to see how this latest news pans out for the social network; one of the authors of the research has already apologised to the unsuspecting user base for any offence the study may have caused.
Adam Kramer of Facebook told the BBC: "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out".
"At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
He went on to say that the company didn’t "clearly state [their] motivations in the paper".
"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused."
Stable doors and horses come to mind …
What do you think of the news? Do you think that Facebook has acted unethically or is it something that we should have come to expect? Let us know in the comments below!