2. BACKGROUND INFO
All Facebook users see a newsfeed when they log in,
composed of friends’ posts and other trending news
A complex algorithm sifts through around 1500
“newsworthy” items and chooses approximately 300
of them to display on your news feed at any given
time
What goes into your newsfeed is supposed to be
determined by your likes, your posts, popular topics
among your friends’ posts, etc.
3. SO WHAT’S THE ISSUE?
Turns out Facebook conducted a huge experiment
on 689,003 users selected at random, without their
knowledge or consent
Altered their newsfeeds: inundated some users with
negative posts, and others with positive posts
The goal: to see what effect the altered feeds would
have on what users subsequently posted themselves
4. CRITICS SAY…
Academic researchers are required to get permission
from people before conducting psychological research
on them; Facebook’s experiment falls under this umbrella
Facebook is way off base here. They are a social media
site, not psychology professors
Blatant disregard for people’s safety and mental health.
What if an already depressed person was randomly
selected to receive a negative newsfeed? Facebook should
have considered more of the possible repercussions.
5. FACEBOOK’S SIDE:
Every one of the 1.28 billion Facebook users agreed to the terms and
conditions when they created an account, so consent was given; it’s not
our fault if you didn’t read them
Picture sizes, ad placements, and other elements of your newsfeed are
altered all the time to see what is most effective, and this is no different
Didn’t legally need to get explicit permission, and doing so would have
made people more self-conscious, altering the results
Claims the experiment was to users’ benefit: wanted to see if there was any
truth to the concern that seeing friends’ positive posts makes people
feel bad or left out and want to stop going on Facebook altogether
6. THE RESULTS?
“MOODS ARE CONTAGIOUS”
People who saw positive newsfeeds posted more
positive things
People who saw negative newsfeeds posted more
negative things
Moods can spread online, not just face to face
7. QUESTIONS
The general consensus is that the experiment
wasn’t illegal, but is it ethical? Should Facebook
have asked the users selected for the experiment
for consent?
Was this a valuable study, or psychological
manipulation? Did it pose any risks to participants,
or was it essentially harmless?
Google monitors your searches, Yahoo tracks what
articles you read. Is this ok? Is it any different from
what Facebook did?
8. WORK CITED
Goel, Vindu. "Facebook Tinkers With Users’ Emotions in News Feed
Experiment, Stirring Outcry." The New York Times 29 June 2014,
Technology sec.: B1. Web. 14 Sept. 2015.
<http://www.nytimes.com/2014/06/30/technology/facebook-tinkers-with-
users-emotions-in-news-feed-experiment-stirring-outcry.html?_r=1>.