
Facebook played with 700,000 of its users’ emotions by deliberately feeding them gloomy and depressing information. The world got upset, shockingly. And Facebook COO Sheryl Sandberg apologized yesterday.

Or did she?

Above: Sheryl Sandberg

Well, here’s what Sandberg said in her “apology.”

“It was poorly communicated …

“For that communication we apologize …

“We never meant to upset you.”

Those of you who have kids (who seldom want to own their mistakes) or those who work with the local office A-hole (who never feels the need to) will recognize that apology instantly. It’s the oh-shit-people-called-me-on-my-behavior-so-it’s-damage-control-time apology.

It’s also the I’m-not-sorry-I-did-it-but-I-am-sorry-I-got-caught apology.

The study wasn’t poorly communicated; it wasn’t communicated at all. And the global mob isn’t demanding an apology for the communication, or the lack of it. It’s demanding an apology for Facebook playing God and running unannounced, unsolicited, unapproved studies on ordinary, average people.

And: “We never meant to upset you?”

Really? That makes it all better, doesn’t it? Especially when Facebook’s global head of policy, Monika Bickert, is saying that the company’s spooky mind-altering studies were all about innovation. So it was really all for our own good in the long run.

Facebook, here’s a good apology:

“I’m sorry, we screwed up. I apologize, and it won’t happen again.”

But Facebook probably doesn’t really get it yet, or understand the problem. After all, its entire modus operandi, its business model, is predicated on altering our minds by presenting a certain slice of life to us through its usually-bubbly reality bubble creator, EdgeRank.

In a very real sense, the entirety of Facebook is a massive unregulated billion-person psychology experiment, with controls and dials and buttons that are very much private, very much unknown, very much proprietary trade secrets. How do we know what Facebook has included in that algorithm that makes us happier, or grumpier, or more likely to click on ads and buy stupid shit?

Psychology experiments are regulated precisely because people matter, and consequences happen.

What if, for example, one of those 700,000 people had been suffering from depression, secretly, and as a result of seeing sad stories, took his or her life? Would Facebook have been responsible?

A real apology requires a real understanding of the real actions you took. I don’t see any of that yet from the big blue monster.

