
Facebook 'manipulated users' emotions' in secret study

For one week in 2012, Facebook tampered with its news feed algorithm to see how it affected users.

A STUDY DETAILING how Facebook secretly manipulated the news feed of some 700,000 users to study “emotional contagion” has prompted anger on social media.

For one week in 2012, Facebook tampered with the algorithm used to place posts into users' news feeds to study how this affected their mood.

The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.

The researchers wanted to see whether the number of positive or negative words in the messages users read affected whether those users then posted positive or negative content in their own status updates.

Indeed, after the experiment, the affected users began to use more positive or negative words in their updates, depending on which kind of content they had been shown.

“Emotional contagion”

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study authors wrote.

“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

While other studies have used metadata to examine trends, this one appears to be unique in that it deliberately manipulated what users saw in order to provoke a reaction.

Ethical?

The study was legal according to Facebook’s rules — but was it ethical?

Susan Fiske, a Princeton University professor who edited the report for publication, told The Atlantic that she was concerned about the research and contacted the authors.

They in turn said that their institutional review boards approved the research “on the grounds that Facebook apparently manipulates people’s News Feeds all the time”.

Fiske admitted to being “a little creeped out” by the study.

Facebook told The Atlantic that it "carefully consider[s]" its research and has "a strong internal review process".

Facebook, the world’s biggest social network, says it has more than one billion active users.

- © AFP, 2014

Read: It’s official – everybody hates happy couples on Facebook >
