Facebook has been under immense scrutiny since it cut Cambridge Analytica (and parent SCL Group) off from its platform in mid-March, after word surfaced that the firm had improperly obtained Facebook user data. Searing criticism and negative publicity from the revelation forced Cambridge Analytica to shut down, and the full consequences for Facebook itself aren’t yet clear.
A panel at the Milken Global Conference last week took a hard look at Facebook’s role in our lives and whether the social networking giant should be held accountable through a social contract. If we entrust Facebook with our personal data, what safeguards will it offer us in the age of Russian bots, trolls, fake news, and target marketing? Willow Bay, dean of the University of Southern California’s Annenberg School for Communication and Journalism, moderated the session.
The panelists included Chris Hughes, Facebook cofounder and co-chair of the Economic Security Project; John Steinberg, founder and CEO of Cheddar; Om Malik, partner at True Ventures; and Tristan Harris, cofounder and executive director of the Center for Humane Technology.
The panelists looked into how Facebook could be regulated, or how it might even have to pay us for the use of our data in the future. No one on the panel was particularly optimistic that Facebook would make the right decisions on its own, as Facebook CEO Mark Zuckerberg has promised.
Here’s an edited transcript of the session. And you can watch it on video here.
Willow Bay: Welcome to social media and the social contract. I’m a regular here at this conference, and I often find that it functions as a snapshot of sorts. It captures not just the issues and topics of the day, but also the mood. John Steinberg of Cheddar and I were here at last year’s social media panel, which was about fake news. We covered issues of politics and the erosion of truth. Flash forward a year and here we sit.
It feels like a very different time. Fake news almost seems like a kinder, gentler era in social media. That was before we really understood the role bots play in the social media ecosystem, before Cambridge Analytica and the harvesting of data on 87 million users, and before Mark Zuckerberg was invited, or called, before Congress.
To me this moment feels different. I’d like to start by asking this panel the same question. Is it just a great big Facebook problem? Is it a tech backlash? Is it the next phase of the digital revolution? Or is it a moment of cultural reckoning, or something else? Chris, you think it took the Cambridge Analytica scandal to open the door to what’s going on and educate users.
Chris Hughes: I think it’s only beginning. I hope it’s only beginning. We all know the state of play, if you will, after the Cambridge Analytica scandal. 87 million Facebook users’ data was exposed, and in many cases used against them in political advertising. All of a sudden people are asking fundamental questions. How much data do I create? Do I own it? Does Facebook own it? If it’s my photo, is it mine to keep? If you take a photo of me, is that mine? Can I take it with me to other platforms? Where can I go if I want to go elsewhere? Is there any real competition in this space?
I do think this is a watershed moment, a cultural reckoning. I hope that it’s just the beginning. Now that we’re on the other side of Mark’s testimony in front of Congress a couple weeks ago, at least amongst tech folks, there’s a collective exhale. “We got through this phase.” I think that’s profoundly misplaced. Instead, true leadership in this moment should view this as an opportunity to have a big cultural conversation about all this data, about who owns it, about what happens to it, and even bigger questions about what role government should play, and whether we as users of these platforms should be compensated, or have some share of the wealth these platforms create. It’s a watershed moment, but it should be seen as an opportunity.
Bay: Tristan, a little over a year ago you appeared on 60 Minutes and really alerted us, in a very national and visible way, to the behavioral modification machines that are both the tools and the media we consume. First of all, could you briefly share what you said? And what do you think of the reaction since then?
Tristan Harris: My background, I was a Google design ethicist, which meant—if you have a 2-billion-person ant colony called humanity and you put a phone in their pocket, how does it manipulate their psychological biases and get them to do things? So the question is, how do you ethically manipulate 2 billion people’s thoughts?
Back in 2013, I did a presentation at Google about how we had a moral responsibility in shaping people’s attention, their choices, and the relationships they attend to or not. To your point, what people are waking up to is that technology is increasingly the number one political, social, electoral, cultural actor in the world. The more people have a phone in their pocket, to set the table stakes, there’s 2 billion people using Facebook. 1.5 billion people view YouTube. That’s about as many people as follow Islam. Millennials check their phones 150 times a day, from the moment they wake up and turn the alarm off to when they go to bed and turn it on.
We have you from the moment you wake up. Thoughts start streaming into your head that you’re not controlling. The designers of the technology companies really do control what people think. That question becomes, “How do you wake people up to that?”
60 Minutes, a year ago, was opening up the conversation about addiction and how people’s minds are influenced by things they don’t see. Things like Cambridge Analytica and the Russian bots are waking people up to the fact that you can sway—it’s a remote control for manipulating an election. Hitler put a radio in every home. Now Putin just needs Facebook in everyone’s hands. We have a business model that makes this business as usual. The business model enables those problems, and I hope to talk more about that.
Bay: Om, as someone who both covers technology and invests in technology, how would you characterize this moment?
Om Malik: We’re in between the past and the future. For the longest time, we’ve been controlled by the rules and ideas and ideologies of the industrial era, where the world moved at a more human scale. Now we’re going into a world which moves at the speed of the network. Our thoughts are manipulated at the speed of the network. As human beings we’re finding out that there are actors out there. We don’t know what they’re doing. We’re caught between the past and future.
The technology which is creating problems will also come up with solutions, but how do we manage our future? All Facebook is, it’s the most efficient form of a network effect, the most efficient form of behavior modification. It’s the genetically modified tobacco of social manipulation. It’s done a great job of what it was started to do – not in 2004, but post-2008 – to be the most efficient advertising platform in the history of humankind. They have done a great job of that. You can see that in the stock price, in the earnings. The system is working as intended. There’s nothing crazy about that, from a technology standpoint.
From a social and cultural standpoint, in the last six months people have woken up to the idea that this is not good for us. I don’t think people have widely realized just how bad it is. They’ve just realized it’s not good.
Bay: John, through the lens of digital media—you cover business. Facebook is the second-biggest seller of digital ads, with no signs of that diminishing. Is it still business as usual in the business and advertising community, or are they coming to terms with this as a moment of reckoning?
John Steinberg: I got it wrong. When the Cambridge Analytica thing came out, I thought that—not that advertisers would care about it, but what Facebook did almost immediately afterward was they made the product less effective. They clamped down on a lot of tools. You could no longer use credit-card data, third-party data, to be able to target people. They took away tools we use that allowed us to see which advertising content was being seen by people after the fact. They made the product less effective. They closed a lot of loopholes.
I thought this was going to be bad. Facebook’s product would still be the best advertising product, but it would be 10 percent less good. What I didn’t realize is that their product is so much better than everybody else’s product, and they have the 2 billion people on it, and they have those 2 billion people coming back daily, if not more often. There was no impact.
Then, the marketers. There is a cluster of very high-profile CMOs in the United States, Fortune 500 companies. They want good PR. They want to be viewed as caring about things that matter in society, but ultimately they care more about selling their products. You’ll continue to see these breast-beatings from marketing giants. “We’ll pull our ads from YouTube. We’ll pull our ads from Facebook. If Facebook doesn’t fix this we’ll have big problems with our marketing budget.” But it’s all spin. It’s impossible for a company in the United States to not market on Facebook.
Bay: Next quarter, when we get results that will reflect this current period of time—
Steinberg: We just had the Facebook quarter, which had some of it, to your point. They’ll continue to be better and better.
Malik: 42 percent year-on-year growth in their revenue. You can’t go wrong with that. The CMOs are leaning into Facebook and Google, not the other way around. It’s a fallacy to think that Facebook needs the big consumer brands, like P&G. They don’t. They can make small brands big almost overnight. Whether it’s Zynga, whether it’s Spotify, they are the new kingmaker. You’re more beholden to Facebook than ever, and to Google.