It was creepy enough when Facebook announced how it knew when you were about to get into a relationship – or were breaking up. Then it bought WhatsApp, adding to its growing avalanche of user data. (Read also: Should you be creeped out when Facebook buys WhatsApp?)
To quote Aunt May in the movie Spider-Man: “With great power comes great responsibility” – but it looks like researchers may have missed that when they published this study.
In a study conducted on nearly 700,000 users, Facebook, together with Cornell University and the University of California, tested the extent to which people were affected by the content showing up in their Newsfeeds.
For one week in January 2012, some users saw mainly positive stories, while others were fed depressing posts. The result? Scientists concluded this was “experimental evidence for massive-scale contagion via social networks”, said The Guardian.
Here’s one example of a finding from the study: “We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their Newsfeed were less expressive overall on the following days, addressing the question about how emotional expression affects social engagement online.”
The experiment took place over one week, from 11 to 18 January 2012. Participants were randomly selected based on their user IDs, said the study. For the full report by the researchers, click here.
It has also sparked alarm among authorities.
According to the same article by The Guardian, a senior British MP called for a parliamentary investigation into how Facebook and other social networks manipulated emotional and psychological responses of users by editing information supplied to them.
The article also quoted several tweets by Clay Johnson, the co-founder of a digital agency that built and managed Barack Obama’s online campaign in 2008.
“Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting Upworthy [a website aggregating viral content] posts two weeks beforehand? Should that be legal?” Johnson said.
Adam Kramer, the data scientist who led the research, wrote this post to clarify his stance:
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper." Read the whole post here.
“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely,” a Facebook spokesperson told Marketing.
Facebook’s chief operating officer Sheryl Sandberg also admitted in an interview with NDTV that it had communicated poorly on the controversial psychological experiment, but denied any attempt to control the emotions of users.
“We communicated very badly on the emotions study ... we hope users understand that we care about their privacy ... we want to be transparent and give users control.”
Forrester senior analyst Fatemeh Khatibloo wrote in a blog post that while Facebook’s study crosses ethical lines, its use of the data is likely legitimate. “Consumers are understandably outraged by what they perceive as an abuse of their postings. But Facebook’s Data Use Policy explicitly allows the firm to use data for internal research purposes. Still, the potential for users to abandon Facebook is real,” said Khatibloo.
Facebook has novel data to analyze, and in the long term this could change marketing practices significantly. The kind of data Facebook is starting to exploit is unique: it could combine evergreen affinities with contextually specific emotional states to change how brands buy media and measure performance, added Khatibloo. In the short term, however, the backlash could see outraged users leaving the site in droves, she concluded.