Facebook’s policy head calls mood-manipulation experiment ‘innovation’

At the Aspen Ideas Festival yesterday, Facebook’s Global Head of Policy Monika Bickert said Facebook’s emotional contagion experiment was all about innovation.

“I believe that was a week’s worth of research back in 2012,” Bickert said onstage at the festival in reference to the experiment. “And most of the research that is done on Facebook, if you walk around campus and you listen to the engineers talking, it’s all about ‘how do we make this product better,’ ‘how do we better suit the needs of the population using this product,’ and ‘how do we show them more of what they want to see and less of what they don’t want to see,’ and that’s innovation.”

The comment is quite a contrast to COO Sheryl Sandberg’s apology yesterday. Sandberg said she had misgivings about the lack of communication around the experiment, which was conducted without informing any of the participants. Adam Kramer, one of the researchers behind the study, has also acknowledged that it could have been handled better.

For those just tuning in: for one week in 2012, Facebook ran a study that tested nearly 700,000 users’ reactions to positive and negative content in their news feeds, which Facebook altered without their consent. The results were published in the Proceedings of the National Academy of Sciences. The researchers found that our peers’ emotions can indeed affect our own.

Since then, we’ve witnessed a torrent of anger, apology, and explanation. Privacy regulators in the U.K. and Ireland are weighing action against the social network; both countries have much stricter privacy rules than the U.S.

One of the main sentiments coming out of Facebook in regard to this study is that the company was acting in the best interests of its users: that the study was intended to make the product better. Isn’t that what companies are supposed to do?

It’s true that Facebook was not transparent with its users. The company manipulated people without their consent and then snuck a line into its Data Use Policy allowing user data to be used in research. And people should be upset, but not because they’re powerless to stand up to the great entity that is Facebook.

Facebook has access to a glorious amount of data that, under the right circumstances, could be used to better understand human behavior — and that makes it an innovative tool. And, to a certain extent, Facebook’s study was “innovation,” as Bickert said. But Facebook not only needs to be transparent about what it’s doing with user data; it also needs to be held to the same guidelines as any scientific research body.

And that’s the user’s job. We created this vast resource worth studying with our pictures of toddlers and puppies and life “milestones.” The user can hold the company accountable, because at the end of the day no one is forcing us to use Facebook. Sure, Bickert is concerned that legislation will stifle innovation, but the company has already shown it can’t be trusted with its own innovative tool.

Here’s the full text of Bickert’s comment yesterday:

You’ve pointed out a couple interesting issues and one is the tension between legislation and innovation.  It is, in the specific incident that you’re referring to, although I’m not really the best expert and probably our public statements are the best source for information there, I believe that was a week’s worth of research back in 2012. And most of the research that is done on Facebook, if you walk around campus and you listen to the engineers talking, it’s all about ‘how do we make this product better,’ ‘how do we better suit the needs of the population using this product,’ and ‘how do we show them more of what they want to see and less of what they don’t want to see,’ and that’s innovation.

That’s the reason that when you look at Facebook or YouTube you’re always seeing new features. And that’s the reason that if you’ve got that one annoying friend from high school who always posts photos of her toddler every single day, that’s the reason that you don’t see all those photos in your news feed.

So it’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation. At the same time it’s incumbent upon us, if we want to make sure we don’t see that legislation, it’s incumbent upon us to make sure we’re transparent about what we’re doing and that people understand exactly why we’re doing what we’re doing.

Comments:
Martin Miller

it means they don't give a shit how you feel...and never will...

Anthony Hesse

You're still here so I guess not. So what do you do for a living considering you're brain dead?

Matthieu Lebas

they are not sorry, they want us to enter the matrix to sell more ads and control our behaviors, that's the aim of Zuck: to become The Architect

Masthan Sheik

So many special kinds of idiots, okay with being manipulated. They don't care, they don't mind, and they think what was done was our fault for accepting some terms and conditions that actually no one read.

Thomas Chenhall

Yeah they're sorry, sorry as the CIA for starting MK ULTRA against the will of JFK. Inadvertently setting into motion the deaths of JFK, MLK, John Lennon, and countless other less-famous stubbornauts.

Michael Donovan

There is a reason why research is supposed to consider ramifications, and consider the ultimate ethical considerations.

Gene Warren

Nice to see Godwin's Law already in effect - "Death camp doctors?" Really?

Gene Warren

I really don't think what they did rose to the level of requiring informed consent. Facebook constantly, publicly monkeys around with privacy settings, ad placement, how your news feed is organized, which friends you see and which you don't - all of this is common knowledge and loudly debated, complained about etc across all media channels. Nothing that was done here has any greater potential to cause harm than all of those (and I don't believe, for one, that they properly controlled for those variables in the experimental design).

David Siorpaes

What's so wrong with what they did? I couldn't care less, honestly.

Eric Marshall

Well, they didn't offer compensation to their test subjects.

Jeremy E. Simpson

"Innovation"...my ass. Sounds like Psy Ops to me, aka: Brainwashing and strong potential for tort for hard$hip

Ariadna Jacob

Alan Stewart I wonder if it would have been different if they only posted positive stuff on people's feeds :) I would pay for that service haha

Jodi Kule-Nagel

Enough of this! It was done. Did anything bad happen: explosions, mass suicides? All the FB haters, log off.

Joshua Darlington

Employee of the month? Or someone with a casual acceptance ... err, a passionate defender of predatory mind hacking?

Andrew Stover

Except you get to leave Facebook at any time you choose. It's their product. You signed the terms by using it. Deal with it or move on.

Joshua Darlington

Those death camp doctors were innovators too. Sociopaths running a social network. Almost comedy.

Toby Farren

There's nothing wrong with what they did honestly. Did it negatively impact the people who were 'experimented' on?