Privacy, Facebook’s critical issue — and Achilles heel — has come to the fore again. After the company unveiled a program that automatically shared data with special partners and changed the way it tracks people’s interests last month, bloggers and users are up in arms.
For Facebook’s youthful founder and CEO, Mark Zuckerberg, who turns 26 on Friday, it’s a conundrum.
The company confirmed it has scheduled an all-hands meeting for 4 p.m. Pacific time on Thursday to discuss the issues: a Q&A for Facebook’s thousand-plus employees to clear the air, rather than an announcement of new plans.
No, this isn’t a reaction to mass protests like those of 2006, when the introduction of the news feed drew objections from 10 percent of the user base.
The problem is that Facebook’s changes, and the reaction to them over the past year, are turning out to be more insidious and dangerous to the brand in the long run. A few high-profile influencers in the tech community, like Google’s Matt Cutts and GDGT’s Peter Rojas, cut the cord and closed their accounts recently, while a New York-based group of hackers raised more than $100,000 to build a more private alternative to the social network. Facebook’s vice president of communications and public policy, Elliot Schrage, answered reader questions publicly in The New York Times, saying, “Everything is opt-in on Facebook. Participating in the service is a choice.”
Sending Schrage to the Times isn’t enough.
Mark Zuckerberg needs to come forward and explain what he truly and genuinely believes about privacy. Why? Because even as the company has created ever-more-detailed privacy controls, Facebook’s moves can appear disingenuous (even if they’re not). Why spend months designing a privacy overhaul and default most of the user base to public? Why do people have to choose between an emptier profile and making their likes and interests public? Where is this instant personalization project going?
It feels like a slippery slope. To where? Facebook’s users don’t know. So the company should just be frank and that message should come from the top.
Mark should write a memo like the one Steve Jobs wrote explaining Apple’s deep aversion to Adobe’s Flash.
Or something like this Googley memo from senior vice president Jonathan Rosenberg explaining Google’s philosophies on openness.
Or something like this piece Sergey Brin wrote to explain the mission behind Google Books in The New York Times last fall.
Or how about this one? “Calm Down. Breathe. We hear you.” That was from a very different Zuckerberg four years ago when the company was much smaller (and more humble).
Facebook has a powerful mission to make the world more open and connected, but the values that Zuckerberg stands for aren’t clear. And that’s the problem. With other great technology companies (which is what Facebook aspires to be), the ideals have been apparent. Steve Jobs made technology an art form. Bill Gates wanted to put a computer in every home; now he’s out fighting global poverty. Even though Google faces constant public criticism, its cheesy “Don’t be evil” motto protects it. Sergey Brin and Larry Page have cultivated a public image of themselves as two brilliant technical minds who care about pursuing society’s best interests. So when they overstep, it’s often seen as a well-intentioned mistake.
When Zuckerberg got on stage last month at f8, the company’s periodic conference for developers, his speech was far-reaching. Visionary, even. But he expressed little empathy for the millions of users who have come to entrust the company with their most personal details. He wasn’t listening. At the company’s sixth birthday party, when it launched another redesign, some employees even made fun of bewildered and anxious user status updates projected on a screen.
Zuckerberg has created a corporate culture where Facebook knows best. That may work for a hardware and mobile company like Apple, but it doesn’t work as well for an iterative and communal product like a social network.
I asked Facebook’s public relations department earlier this week whether there were any essays, public statements, or speeches that reflect Zuckerberg’s philosophy on privacy. They didn’t respond to that particular question. But from his previous comments, here is what I can gather about his thinking.
He believes that people should have a single identity: “You have one identity,” he emphasized three times in a single interview with David Kirkpatrick in his book, “The Facebook Effect.” “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly.” He adds: “Having two identities for yourself is an example of a lack of integrity.”
He’s a proponent of “radical transparency,” the idea that transparency will overtake modern life: “The world moving towards more transparency could be the trend driving the most change over the next ten to twenty years,” he said in the book. Charlie Cheever, an early Facebook engineer who left to start Quora, added: “I feel Mark doesn’t believe in privacy that much, or at least believes in privacy as a stepping-stone. Maybe he’s right, maybe he’s wrong.”
But he is deeply conscious of the fact that most people don’t subscribe to that idea: “To get people to this point where there’s more openness — that’s a big challenge. But I think we’ll do it. I just think it will take time,” Zuckerberg said in the book. “The concept that the world will be better if you share more is something that’s pretty foreign to a lot of people and it runs into all these privacy concerns.”
Yishan Wong, another long-time Facebook employee who recently left, wrote on Quora that the company spends an immense amount of time focusing on how users want to control information: “My observation of Facebook as a company (its people, including its executives) is that it cares a lot about privacy. It spends a lot of time thinking about it, it spends a lot of time thinking about how to protect its users’ privacy, and then (ironically) it is continually surprised at how the vast majority of its users don’t end up really caring at all to make use of various privacy-protection mechanisms built into the products.”
He added, “Mark Zuckerberg probably cares about privacy, but he probably also understands it in a far deeper way than most people do, because he has to work with it in a real and practical sense, and so if he ‘doesn’t believe in it,’ it’s in the way that someone doesn’t ‘believe in’ a primitive and unexamined view of something when he has had to personally develop a fuller and deeper understanding of it.”
The problem is that privacy is not generally understood to be a reciprocal right, according to Wong. People love to invade other people’s privacy, but are upset when the same is done to them. Hence, every time Facebook has become more open and public, its usage has gone up because people suddenly get more access to each other’s information.
He believes that transparency has the power to create more empathy and tolerance: He emphasized the point again and again in speeches to developer communities in Europe.
He’s afraid that if the company doesn’t gradually open up, it will fail: He stressed the perils of building walls in the book. “The best thing we can do is kind of move smoothly with the world around us and to have constant competition, not build walls. To the extent that we think most of the sharing is going to happen outside of Facebook anyway, we really want to encourage it. I can’t guarantee we’ll succeed. I just think that if we don’t do this then eventually we will fail.”
At Startup School last fall he said: “The biggest risk you can take is to take no risk. In a world that’s moving quickly, you know that if you don’t change you’ll lose. Not taking risk is the riskiest thing you can do. You have to do things that are kind of bold even if they’re not obvious.”
He believes that Facebook’s approach to privacy, in which users explicitly share their interests, is more ethically sound than Google’s data collection approach: Zuckerberg tells Kirkpatrick, “Let me paint the two scenarios for you. They correspond to two companies in the Valley. It’s not completely this extreme, but they are on different sides of the spectrum. On the one hand you have Google, which primarily gets information by tracking stuff that’s going on. They call it crawling. They crawl the web and get information and bring it into their systems. They want to build maps, so they send around vans which literally go and take pictures of your home for their Street View system. And the way they collect and build profiles on people to do advertising is by tracking where you go on the Web, through cookies with DoubleClick and AdSense. That’s how they build a profile about what you’re interested in. Google is a great company, but you can see that taken to a logical extreme that’s a little scary.
On the other hand, we started the company saying there should be another way. If you allow people to share what they want and give them good tools to control what they’re sharing, you can get even more information shared. But think of all the things you share on Facebook that you wouldn’t want to share with everyone, right? You wouldn’t want these things to be crawled or indexed — like pictures from family vacations, your phone number, anything that happens on an intranet inside a company, or any kind of private message or e-mail. So a lot of stuff is getting more and more open, but there’s a lot of stuff that’s not open to everyone.
This is one of the most important problems for the next ten to twenty years. Given that the world is moving toward more sharing of information, making sure that it happens in a bottom-up way, with people inputting the information themselves and having control over how their information interacts with the system, as opposed to a centralized way, through it being tracked in some surveillance system. I think that’s critical for the world. That’s just a really important part of my personality, and what I care about.”
All of this reflects some incredibly thoughtful deliberation. The problem is that the public isn’t aware of it. And because of that, users feel as if Facebook is making promises it will simply take away a year or two later.
If Zuckerberg has a carefully thought-out vision for where privacy is going, he needs to lay it out. Maybe then we’ll be willing to follow where he leads.
[Homepage photo: deneyterrio]