At the Aspen Ideas Festival yesterday Facebook’s Global Head of Policy Monika Bickert said Facebook’s emotional contagion experiments were all about innovation.
“I believe that was a week’s worth of research back in 2012,” Bickert said onstage at the festival in reference to the experiment. “And most of the research that is done on Facebook, if you walk around campus and you listen to the engineers talking, it’s all about ‘how do we make this product better,’ ‘how do we better suit the needs of the population using this product,’ and ‘how do we show them more of what they want to see and less of what they don’t want to see,’ and that’s innovation.”
The comment is quite a contrast to COO Sheryl Sandberg’s apology yesterday. Sandberg said she had misgivings about the lack of communication involved in the experiments, which were conducted without informing any of the participants. Adam Kramer, one of the researchers behind the study, has also said the experiment could have been conducted better.
For those just tuning in: for one week in 2012, Facebook ran a study that altered the news feeds of nearly 700,000 users, without their consent, to test their reactions to positive and negative content. The results were published in the Proceedings of the National Academy of Sciences. The researchers found that our peers’ emotions can indeed affect our own.
Since then, we’ve witnessed a torrent of anger, apology, and explanation. The U.K. and Ireland are thinking of suing the social network — both countries have much stricter privacy guidelines than the U.S.
One of the main sentiments that has come out of Facebook in regard to this study is that the company was acting in the best interests of its users. That the study was intended to make the product better. Isn’t that what companies are supposed to do?
It’s true that Facebook was not transparent with its users. The company manipulated people without their consent and then snuck a line into its Data Use Policy allowing for data to be used in research. And people should be upset, but not because they’re powerless to stand up to the great entity that is Facebook.
Facebook has access to a glorious amount of data that under the right circumstances could be used to better understand human behavior — and that makes it an innovative tool. And, to a certain extent, Facebook’s study was “innovation,” as Bickert said. But Facebook needs not only to be transparent about what it’s doing with user data but also to be held to the same guidelines as any scientific research body.
And that’s the user’s job. We created this vast resource worth studying with our pictures of toddlers and puppies and life “milestones.” The user can hold the company accountable, because at the end of the day no one is forcing us to use Facebook. Sure, Bickert is concerned that legislation will stifle innovation, but the company has already shown it can’t be trusted with its own innovative tool.
Here’s the full text of Bickert’s comment yesterday:
You’ve pointed out a couple interesting issues and one is the tension between legislation and innovation. It is, in the specific incident that you’re referring to, although I’m not really the best expert and probably our public statements are the best source for information there, I believe that was a week’s worth of research back in 2012. And most of the research that is done on Facebook, if you walk around campus and you listen to the engineers talking, it’s all about ‘how do we make this product better,’ ‘how do we better suit the needs of the population using this product,’ and ‘how do we show them more of what they want to see and less of what they don’t want to see,’ and that’s innovation.
That’s the reason that when you look at Facebook or YouTube you’re always seeing new features. And that’s the reason that if you’ve got that one annoying friend from high school who always posts photos of her toddler every single day, that’s the reason that you don’t see all those photos in your news feed.
So it’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation. At the same time, if we want to make sure we don’t see that legislation, it’s incumbent upon us to make sure we’re transparent about what we’re doing and that people understand exactly why we’re doing what we’re doing.