Facebook’s little social experiment got you bummed out? Get over it


OP-ED — You would think, from the reaction some people are having, that Facebook’s recent admission that it experimented with some people’s feeds was tantamount to Watergate.

You would think there had been some terrible violation of privacy or a breach of confidential user data. Instead, 700,000 people read a slightly different version of their news feed than the rest of us.

Here’s what happened: Over a one-week period back in 2012, Facebook altered the balance of content in the news feeds of 700,000 of their users. They showed some people more “positive” content while others saw more “negative” content. According to the published paper, “Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software.”
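To make that one-word rule concrete, here is a minimal sketch of the kind of word-presence tagging the paper describes. It is illustrative only: the actual study relied on the LIWC dictionaries, and the tiny word lists below are made-up placeholders, not the real lexicon.

```python
# Minimal sketch of word-presence sentiment tagging, assuming made-up word
# lists. The actual study used the LIWC dictionaries; these sets are
# placeholders for illustration only.
POSITIVE_WORDS = {"happy", "great", "love", "awesome"}
NEGATIVE_WORDS = {"sad", "sucks", "hate", "can't"}

def tag_post(text: str) -> set[str]:
    """Tag a post 'positive' and/or 'negative' if it contains at least one
    word from the corresponding list."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    tags = set()
    if words & POSITIVE_WORDS:
        tags.add("positive")
    if words & NEGATIVE_WORDS:
        tags.add("negative")
    return tags

print(tag_post("This SUCKS. I CAN'T do that thing I wanted to do"))  # {'negative'}
print(tag_post("I'm so HAPPY"))                                      # {'positive'}
```

As the rule suggests, a single word is enough to flag a post either way, which is why such a blunt measure could be applied to hundreds of thousands of feeds at once.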

The goal was to determine what (if anything) would happen to these two groups. Would they express more positive emotions after seeing lots of positive content? Would they express more negative ones if the opposite were true? In short: Could their subsequent behavior be seen as “a form of emotional contagion”?

Now you might think that in order to elicit a really strong and measurable set of reactions, Facebook inserted stories of small children being abused or an especially adorable set of cat videos. After all, if we were going to be mad at Facebook it should be because they showed us content we wouldn’t have otherwise seen, in an attempt to screw with our emotional state, right?

But that didn’t happen.

Or at least, it might have, but Facebook didn’t create that shared content. The friends of the 700,000 unwitting participants created it.

That’s right. Facebook merely adjusted the balance of content these users would have already seen if they looked at each and every one of the items their friends had shared over the one-week period.

“But how could they do that?” I hear you say. “I never agreed to be part of such an ‘experiment’.”

Actually, you did. When you signed up for Facebook. And if you’ve been a member for more than a few months, I guarantee this was far from the only time Facebook has done it.

In fact, the specifically emotional focus and highly organized nature of this experiment aside, almost every website worth its salt has performed similar tests.

Editorial publications will try out different approaches to their content to see what gets readers to click. They’ll try straight-up reportage styles that state the facts and nothing but the facts. They’ll try emotionally charged headlines. They’ll even imitate the BuzzFeed style — you know the one: “You Won’t Believe The Epic Thing This Guy Did Right Before This Happened.” If you’re like me and you tend to throw up in your mouth every time you read one of these digital come-ons, you can thank the testing that went into discovering that formula. It didn’t happen by accident.

According to Wired’s David Rowan, “If there is a science to BuzzFeed’s content strategy, it is built on obsessive measurement. The data-science team uses machine learning to predict which stories might spread; the design team keeps iterating the user interface through A/B testing and analytics.” You can read the full article here, though be prepared for more eye-openers.
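For the curious, here is a minimal sketch of what this kind of headline A/B testing can look like behind the scenes. The variant headlines, the hash-based bucketing, and the in-memory counters are all assumptions made for illustration; this is not BuzzFeed’s or any publication’s actual system.

```python
# Hypothetical sketch of headline A/B testing: readers are deterministically
# bucketed into a variant, and click-through is tallied per variant.
# Variant names, bucketing, and counters are illustrative assumptions.
import hashlib
from collections import defaultdict

VARIANTS = {
    "A": "Facebook Ran an Experiment on Users' News Feeds",
    "B": "You Won't Believe What Facebook Just Did to Your News Feed",
}

impressions = defaultdict(int)
clicks = defaultdict(int)

def assign_variant(user_id: str) -> str:
    """Hash the user ID so the same reader always sees the same headline."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def show_headline(user_id: str) -> str:
    variant = assign_variant(user_id)
    impressions[variant] += 1
    return VARIANTS[variant]

def record_click(user_id: str) -> None:
    clicks[assign_variant(user_id)] += 1

# After enough traffic, compare click-through rates and keep the winner:
# ctr = {v: clicks[v] / impressions[v] for v in VARIANTS if impressions[v]}
```

The “formula” the author complains about is simply whichever variant wins this comparison often enough to become the house style.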

So if this kind of thing is happening all of the time — and make no mistake, it is — why are people so upset by Facebook’s emotional A/B testing?

I can only assume it’s because people found out.

We don’t like the idea that our every move online is being studied, effectively turning us into digital guinea pigs. That’s a pretty reasonable reaction to have when you have a reasonable expectation of privacy.

Which is why the realization that Facebook was tracking its users across multiple websites through its ubiquitous “like” button, even when the user was logged out of Facebook, came as a shock.

But Facebook’s experiment happened on Facebook’s own site, while its users were logged on, and only presented material posted by other users—specifically the friends of the users being studied. Where exactly is the expectation of privacy under these conditions?

And yet, it’s precisely for violations of privacy that Facebook is now being investigated.

Here’s how the Financial Times summarizes it:

The company’s policy did not explicitly disclose that it used personal data in research studies, […] the company had said that it used data “in connection with the services and features we provide”, without specifying research.

So even though the research in question was all about better understanding how its users interact with “the services and features we provide,” apparently they need to state that such research is happening. Good grief.

I suppose our municipalities must now inform drivers that they might be subject to research when they use city streets. After all, how else could we decide where new traffic lights should go?

Here’s a little advice for those who are still feeling that Facebook has once again manipulated them and ignored their privacy: Get over it. This kind of testing is happening all over the web and in real life too.

The negative content that was populated into people’s feeds came from their friends. The positive stuff too. All Facebook did was re-organize it. Did you know you could do the same?  There’s a little drop-down arrow beside the News Feed menu item – simply switch it from Top Stories to Most Recent.

And if you’re still bummed out after reading a week’s worth of negative crap? Don’t blame Facebook. Blame your friends.


Tariq Aziz

Violating corporate "intellectual property rights" by downloading torrents got you executives bummed out?  Get over it!

Brian Jones

Good article; I don't usually go in for op-ed pieces because they tend to be rants just to stir up overreactions. The only minor area where I'd disagree with the author is why people got so upset; I think it was more likely because some "news" feed told them to. Some person got hold of this "you've been horribly wronged by the free service you use" angle on what is otherwise a modern-technology take on something most people figure out in school: hang out in negative circles and you'll be more negative. If Facebook would give me the option of filtering out the negative posts from my friends, I, like most people, would happily click it. When my real friends need to reach out when tragedy strikes, they text or call me. For those of you who quit Facebook over this, well done, you're in control of your life, and that wasn't an area you wanted them fine-tuning. For those who are mad but can't cut ties with Facebook, do like the author said and realize that they didn't hurt you; they're striving to improve their business. I for one am glad that Mark Zuckerberg and company are still clearly at the wheel. They could still be trying to get us to use the poke feature.

Carlos Alberto

This isn't so bad, since people lie to each other all the time in real life and even on TV.

Someone who is used to lies, like me, might feel sad about it or not, but then I see it as a way of showing that they wanted to see how people would react after being told the "possible truth, or a lie disguised as truth" about what does or doesn't get posted on the internet.

The important thing is that it doesn't make people kill themselves or develop mental illnesses.

sb birdette

Nah. I'm not bummed out. I solved the problem by cancelling my Facebook account permanently. Oddly enough, I couldn't be happier. How's that for an experiment?

Lou Zipping

I got over it!


By closing my Facebook Account. 


Adios!

Lizzy Gail

The amount of bitterness in the comments is ridiculous. 


You agreed to the terms and conditions when you signed up for Facebook. If you think that just because you set all of your privacy settings to the strictest options possible you can't be touched by any changes (experimental or otherwise), you are extremely illiterate when it comes to the internet. Just because they didn't actually spell it out for you and say "we might experiment on you to see how we can make improvements to our website in order to earn us more money," it doesn't mean that it wasn't going to happen. If you really think charges need to be pressed over something like this, you are just another sue-happy crazy. They didn't break the law. They used the power that YOU signed over to them.


If you don't like stuff like this happening to you, just get off the internet. You cannot avoid it as long as you have an online presence. Google is constantly experimenting on people, but because no one finds out what the exact experiments are, no one cares. 

The one thing that surprises me about this experiment is that it was so minor. A post had to contain one "negative" or one "positive" word to be tagged as one of them. So if one of your friends said "This SUCKS. I CAN'T do that thing I wanted to do" it would have been tagged as negative. "Sucks" and "can't" are negative words. Is that really going to send you over the edge? If someone posted "I'm so HAPPY" it would have been tagged as positive. 
Facebook didn't fabricate anything. They didn't hack accounts and post as your friends. The only negative and/or positive stuff that would have appeared on your news feed was stuff that your friends decided to post. So if you're mad that for one week (that you more than likely don't remember IF it happened to you) all of your friends decided to post about how terrible their days had been, then maybe you should tell your friends to stop being so dang negative. 



I definitely think that Facebook should notify those who were involved (if they still have active accounts). But I'm betting that 75% won't even care that they were part of this. It was so long ago and had no huge effect*. 


*If you can find evidence that this week of social experimenting caused someone to harm themselves or others, you should probably contact the Better Business Bureau. But don't yell at me about hypothetical situations. 




Also if you are spending money on blocking a website from appearing on your browser, you are probably spending way too much money on something that will have enough holes for the content to leak out. There's this concept of not clicking on links you don't want to visit. 

Ralph Smith

Get over it??? Facebook should be brought up on charges for violation of rights to privacy. Let's see if the scumbag network gets away with this. FB thinks it's above the law. I hate people like that.

Swag Valance

It's Facebook. The only people who care have no lives otherwise, so who cares?

George Hyatt

Zuckerberg is on the payroll of the C.I.A. The intelligence community doesn't have to work hard to find out EVERYTHING about you (and your stupid kids). It's all right there. Oh, your profile is private? Like THAT really matters.


Some advice (and for me too): Get off your ass and go outside. Wanna check on your family: Call them...or better yet write them a letter.


let·ter   [let-er]

noun

a written or printed communication addressed to a person or organization and usually transmitted by mail.


For the young ones: Put your 2nd/3rd grade learning retention to the test and write a good old fashioned letter to someone. It comes across more like you actually give a crap about that person.
Yes, he is on the payroll of the C.I.A. And if you're ok with giving them your data, by all means continue being the sheep ("baa-baa"...or "mooo" for you Beef-what's-for-dinner lovers).

This has been a PSA. And please, for the love of God (who doesn't exist outside your humanly faulted texts of the so-called Word of God: Bible, Torah, Quran; you'd think with two or more religions somebody would get it right...but nooooo...they all gotta be different), take your kids off the damned computer/video games. Make them ride their bikes for a good part of the day and do more of the stuff you did when growing up, or your parents did. I'm getting sick and tired of seeing all these dopey people.

G. Garcia

 I recommend anyone who wants to 'get over it' delete their Facebook account.  That would CERTAINLY get you 'over it.'


For my money, I'll be blocking Venturebeat.com from ever appearing on my browser again -- I guess I got over you, too.  ;)

shift me

This is the same kind of reasoning that led to Reservation Hop (the site that hogs restaurant reservations and resells them). Emotional manipulation, hoarding, etc. are not socially acceptable. Most folks probably wouldn't care about an experiment to drive ad revenue, but when it comes to emotions, that's where it crosses the line for most people.

David Spencer

"Get over it" isn't a valid point of debate so much as it's an attempt to subvert debate altogether in favor of the status quo ante. 

Dan Gulphine

It isn't Facebook that has breached ethics; it is the social scientists who did. Social scientists, at least those affiliated with universities, are held to a much higher standard, an ethical one, beyond the one Facebook has.


One has to take a stand, as a social scientist, on informed consent, because even though one particular experiment may not have caused harm, those kinds of experiments, where emotions are manipulated, can cause harm. This requires oversight and adherence to protocols designed to ensure the safety of the subjects of experimental manipulations.



bbeat beats

- - - Wrong.

 Zuckerberg has a long track record demonstrating his lack of empathy and so it comes as no surprise that he has an inability to differentiate right vs wrong. Also not surprising that Zuckerberg would surround himself with equally clueless sycophants like Sandberg.

If you do not recognize the harm in deliberately manipulating, and potentially damaging, the emotions of thousands of strangers, then it is time for you to learn what the rest of us learned as children: other people's feelings and emotions are every bit as important as yours.


Empathy: it's what's for dinner (you absurd buffoon, 'get over it' indeed).

Craig Smith

Your attempt to spin-doctor the facts regarding Facebook's 'study' is quite transparent.
You aren't fooling anyone.

Peter Aretin

@sb birdette 

It doesn't matter. Data brokers will continue trading your personal data as they were undoubtedly doing before you signed up for Facebook.  You had to provide some personal data just to comment.

Ralph Smith

@Swag Valance When we stop caring about others' rights, that will be the time they come after yours for something you are doing. We'll see then if you care so little!

Tariq Aziz

@George Hyatt It just came out this weekend that the NSA is collecting baby photos - BABY PHOTOS - of people. Wow.


Time to cancel all the "social media" accounts and/or create fake ones and fill their databases with false crap.

Tariq Aziz

@Peter Aretin  That's why some of us provide FAKE information.  Deliberate culture jamming.  LOL

Lizzy Gail

@Ralph Smith @Swag Valance If you have a Facebook account, you AGREED TO TERMS AND CONDITIONS that effectively give away your right to privacy from the Facebook company. With those terms and conditions they can alter your account in whatever way they see fit. If that means deleting pictures that you think are harmless, so be it. If that means making slightly positive posts more common than slightly negative posts, fine. 


They didn't stomp on any of your rights. 

George Hyatt

@Tariq Aziz I wonder if they are playing with/perfecting photo-aging technology. Makes sense. "Here's cute little Johnny when he was 1 year old..here is what he should look like when he's 40 and a domestic terrorist" [due to his own country's making]. "He's made off-color comments about the government before..let's ADD (emphasis on add) to his file."