

Facebook has been under immense scrutiny since it cut Cambridge Analytica (and parent SCL Group) off from its platform in mid-March after word surfaced that it had improperly obtained Facebook user data. Searing criticism and negative publicity from the revelation forced Cambridge Analytica to shut down, and the full consequences for Facebook itself aren’t yet clear.

A panel at the Milken Global Conference last week took a hard look at Facebook’s role in our lives and whether the social networking giant should be held accountable through a social contract. If we entrust Facebook with our personal data, what safeguards will it offer us in the age of Russian bots, trolls, fake news, and target marketing? Willow Bay, dean of the University of Southern California’s Annenberg School for Communication and Journalism, moderated the session.

The panelists included Chris Hughes, Facebook cofounder and co-chair of the Economic Security Project; John Steinberg, founder and CEO of Cheddar; Om Malik, partner at True Ventures​; and Tristan Harris, cofounder and executive director of the Center for Humane Technology.

The panelists looked into how Facebook could be regulated, or how it might even have to pay us for the use of our data in the future. No one on the panel was particularly optimistic that Facebook would make the right decisions on its own, as Facebook CEO Mark Zuckerberg has promised.

Here’s an edited transcript of the session. And you can watch it on video here.

Above: (Left to right) Tristan Harris of Center for Humane Technology, Chris Hughes of Economic Security Project, Willow Bay of the USC Annenberg School for Communication and Journalism, Om Malik of True Ventures, and John Steinberg of Cheddar.

Image Credit: Dean Takahashi

Willow Bay: Welcome to social media and the social contract. I’m a regular here at this conference, and I often find that it functions as a snapshot of sorts. It captures not just the issues and topics of the day, but also the mood. John Steinberg of Cheddar and I were here at last year’s social media panel, which was about fake news. We covered issues of politics and the erosion of truth. Flash forward a year and here we sit.

It feels like a very different time. Fake news almost seems like a kinder, gentler era in social media. That was before we really understood the role bots play in the social media ecosystem, before Cambridge Analytica and the harvesting of data on 87 million users, and before Mark Zuckerberg was invited, or called, before Congress.

To me this moment feels different. I’d like to start by asking this panel the same question. Is it just a great big Facebook problem? Is it a tech backlash? Is it the next phase of the digital revolution? Or is it a moment of cultural reckoning, or something else? Chris, you think it took the Cambridge Analytica scandal to open the door to what’s going on and educate users.

Chris Hughes: I think it’s only beginning. I hope it’s only beginning. We all know the state of play, if you will, after the Cambridge Analytica scandal. 87 million Facebook users’ data was exposed, and in many cases used against them in political advertising. All of a sudden people are asking fundamental questions. How much data do I create? Do I own it? Does Facebook own it? If it’s my photo, is it mine to keep? If you take a photo of me, is that mine? Can I take it with me to other platforms? Where can I go if I want to go elsewhere? Is there any real competition in this space?

I do think this is a watershed moment, a cultural reckoning. I hope that it’s just the beginning. Now that we’re on the other side of Mark’s testimony in front of Congress a couple weeks ago, at least amongst tech folks, there’s a collective exhale. “We got through this phase.” I think that’s profoundly misplaced. Instead, true leadership in this moment should view this as an opportunity to have a big cultural conversation about all this data, about who owns it, about what happens to it, and even bigger questions about what role government should play, and whether we as users of these platforms should be compensated, or have some share of the wealth these platforms create. It’s a watershed moment, but it should be seen as an opportunity.

Bay: Tristan, a little over a year ago you appeared on 60 Minutes and really alerted us, in a very national and visible way, to the behavioral modification machines that are both the tools and the media we consume. First of all, could you briefly share what you said? And what do you think of the reaction since then?

Tristan Harris: My background, I was a Google design ethicist, which meant—if you have a 2-billion-person ant colony called humanity and you put a phone in their pocket, how does it manipulate their psychological biases and get them to do things? So the question is, how do you ethically manipulate 2 billion people’s thoughts?

Back in 2013, I did a presentation at Google about how we had a moral responsibility in shaping people’s attention, their choices, and the relationships they attend to or not. To your point, what people are waking up to is that technology is increasingly the number one political, social, electoral, cultural actor in the world. The more people have a phone in their pocket, to set the table stakes, there’s 2 billion people using Facebook. 1.5 billion people view YouTube. That’s about as many people as follow Islam. Millennials check their phones 150 times a day, from the moment they wake up and turn the alarm off to when they go to bed and turn it on.

We have you from the moment you wake up. Thoughts start streaming into your head that you’re not controlling. The designers of the technology companies really do control what people think. That question becomes, “How do you wake people up to that?”

60 Minutes, a year ago, was opening up the conversation about addiction and how people’s minds are influenced by things they don’t see. Things like Cambridge Analytica and the Russian bots are waking people up to the fact that you can sway—it’s a remote control for manipulating an election. Hitler put a radio in every home. Now Putin just needs Facebook in everyone’s hands. We have a business model that makes this business as usual. The business model enables those problems, and I hope to talk more about that.


Above: Mark Zuckerberg on stage at F8.

Image Credit: Facebook

Bay: Om, as someone who both covers technology and invests in technology, how would you characterize this moment?

Om Malik: We’re in between the past and the future. For the longest time, we’ve been controlled by the rules and ideas and ideologies of the industrial era, where the world moved at a more human scale. Now we’re going into a world which moves at the speed of the network. Our thoughts are manipulated at the speed of the network. As human beings we’re finding out that there are actors out there. We don’t know what they’re doing. We’re caught between the past and future.

The technology which is creating problems will also come up with solutions, but how do we manage our future? All Facebook is, it’s the most efficient form of a network effect, the most efficient form of behavior modification. It’s the genetically modified tobacco of social manipulation. It’s done a great job of what it was started to do – not in 2004, but post-2008 – to be the most efficient advertising platform in the history of humankind. They have done a great job of that. You can see that in the stock price, in the earnings. The system is working as intended. There’s nothing crazy about that, from a technology standpoint.

From a social and cultural standpoint, in the last six months people have woken up to the idea that this is not good for us. I don’t think people have widely realized just how bad it is. They’ve just realized it’s not good.

Bay: John, through the lens of digital media—you cover business. Facebook is the second-biggest seller of digital ads, with no signs of that diminishing. Is it still business as usual in the business and advertising community, or are they coming to terms with this as a moment of reckoning?

John Steinberg: I got it wrong. When the Cambridge Analytica thing came out, I thought that—not that advertisers would care about it, but what Facebook did almost immediately afterward was they made the product less effective. They clamped down on a lot of tools. You could no longer use credit-card data, third-party data, to be able to target people. They took away tools we use that allowed us to see which advertising content was being seen by people after the fact. They made the product less effective. They closed a lot of loopholes.

I thought this was going to be bad. Facebook’s product would still be the best advertising product, but it would be 10 percent less good. What I didn’t realize is that their product is so much better than everybody else’s product, and they have the 2 billion people on it, and they have those 2 billion people coming back daily, if not weekly. There was no impact.

Then, the marketers. There is a cluster of very high-profile CMOs in the United States, Fortune 500 companies. They want good PR. They want to be viewed as caring about things that matter in society, but ultimately they care more about selling their products. You’ll continue to see these breast-beatings from marketing giants. “We’ll pull our ads from YouTube. We’ll pull our ads from Facebook. If Facebook doesn’t fix this we’ll have big problems with our marketing budget.” But it’s all spin. It’s impossible for a company in the United States to not market on Facebook.

Bay: Next quarter, when we get results that will reflect this current period of time—

Steinberg: We just had the Facebook quarter, which had some of it, to your point. They’ll continue to be better and better.

Malik: 42 percent year on year growth in their revenue. You can’t go wrong with that. The CMOs are leaning into Facebook and Google, not the other way around. It’s a fallacy to think that Facebook needs the big consumer brands, like P&G. They don’t. They can make small brands big almost overnight. Whether it’s Zynga, whether it’s Spotify, they are the new kingmaker. You’re more beholden to Facebook than ever, and to Google.

Bay: Is this just a Facebook problem, or is it more?

Malik: It’s across the board. It’s a social media problem, a digital media problem. Anyone who says it’s just a Facebook problem should see the scripts running on their own website. All the newspapers are such hypocrites. The New York Times has 21 tracking scripts running on their website. If they were really that much holier, they wouldn’t be around long.

Bay: Whatever happened to Ghostery? Is that still around? I highly recommend this. It’s a service you sign up for and it shows who’s tracking you on whatever site. It’s a little horrifying, to see that on your feed.
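To make the point concrete: a rough sketch along the following lines lists the third-party script hosts a page loads, which is roughly the kind of thing Ghostery surfaces. The URL, the regex approach, and whatever count it returns are illustrative only; this is not how Ghostery itself works.

```python
# Rough sketch: list third-party <script src=...> hosts on a page, the kind of
# trackers Ghostery surfaces. The URL is just an example, the regex approach is
# approximate, and dynamically injected trackers won't show up at all.
import re
import urllib.request
from urllib.parse import urlparse

def third_party_script_hosts(url):
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "ignore")
    page_host = urlparse(url).netloc
    hosts = set()
    for src in re.findall(r'<script[^>]+src=["\']([^"\']+)', html, re.IGNORECASE):
        host = urlparse(src).netloc
        if host and host != page_host:
            hosts.add(host)
    return sorted(hosts)

if __name__ == "__main__":
    for host in third_party_script_hosts("https://www.nytimes.com/"):
        print(host)
```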

Malik: Just to be clear, traditional media has not been good at this. Facebook and Google are very good at advertising. The old media online, they’re just not as good at doing this – at targeting, at selling ads effectively, at getting more money per ad. The big media companies are still stuck in the past.

Steinberg: I’ll take the other side. I do think it’s just a Facebook problem. Not that I really disagree with anything Om is saying, but the executives at Facebook, the people at Facebook, are so much more arrogant than any company I deal with. I have found that when you sense the tenor of the people at the company, you have a good indication of what’s going on at the company.

It’s so clear that it’s all spin. Even after the testimony, even after the Cambridge Analytica thing, on all their internal message boards, as reported by the New York Times, they were basically saying, “This is so unfair. Why is everyone coming after us?” All these things. That lack of perceptiveness makes them far more dangerous.

Bay: Chris, you were a founder of Facebook. You haven’t worked there in a long time, but you also referenced this, when you talked about the collective exhale. Do you agree with John?

Hughes: I think the problem is much bigger than Facebook, but I also agree with John. There is a culture at Facebook, at Google, at the biggest companies that’s a result of the concentration of power they have. If what Om and John are saying is true, that Procter and Gamble and other huge consumer goods companies don’t have anywhere else to go, that should make anyone who believes in the free market and the virtue of competition very uncomfortable. That’s essentially saying that so much power has been concentrated at Facebook and Google that the biggest, most talented advertisers and marketers have next to no leverage. That should be scary to anyone who cares about that competition.

I do, though, think that we need to be very specific about what problems we’re talking about. I put those problems in three buckets. The first is data privacy and protection. That’s Cambridge Analytica and the larger question around who owns my data. Is it my property? Is it my labor? How do we think about that? The second category is around democracy and the news. How do we make sure we don’t build networks that only reward the most extreme voices? Let alone make sure that foreign powers aren’t able to hack elections. And then there’s the third, which is what Tristan was talking about, the conversation around attention and the way that these companies, through our apps and our devices, very much decide where to direct our attention, and increasingly own more and more of it.

These problems are each so big, it’s hard to talk about all of them. They’ve just been boiling under the surface for more than 15 years now, and they’re all spilling out at once. That’s why I think this is such a critical moment to dig into this.

Bay: Let’s talk a bit about what makes up the social contract in the digital age, what that looks like in mobile and social media. Tristan, could you share with us some of your thinking about how we need to re-orient or redefine that contract?

Mark Zuckerberg testifies in front of the Senate Judiciary and Commerce Committees.

Above: Mark Zuckerberg testifies in front of the Senate Judiciary and Commerce Committees.

Harris: Partially, as Chris said—in Silicon Valley, all of these products emerged out of a libertarian philosophy. We create a product. If people like it they use it. If they don’t like it they won’t. If you don’t like the 2-billion-person social network on Facebook, just switch to another one. But of course, I just came back from the University of Chicago and their conference on anti-trust. The network effects that these companies have created have made them virtually impenetrable. The traditional sources of—who owns the dollars and can you redirect them, through shareholder activism or customer activism—even the customers, the advertisers, don’t have a lot of options.

The question, then, is what kind of relationship do we really have with these platforms? Right now it’s just, sign on the dotted line, get consent, and they can change their policies at any time. Theoretically they should notify you, but they don’t. They can do what they want. They can choose new business models. You, the user, are responsible, because you hit the OK button.

What I’d love to introduce to you—there’s a different kind of relationship that describes these products, and it’s a fiduciary one. If you think about the asymmetric power an attorney has over a client, they know way more about the law. They can manipulate and exploit their client. They have lots of privileged information about the client. If they want to, they can screw them over. The asymmetry of power is enormous. Same with psychiatrists or priests in a confessional.

If you stack that up, how much asymmetric power an attorney has over the client, how much asymmetric power a psychiatrist has over the intimate details and private thoughts of their patient—now, next to that, add to that how much power around the intimate details of your mind and your communication, and even what you don’t know about how your brain works—how much of that does Facebook have? On the grounds of that asymmetric power alone, we should reclassify Facebook as having a fiduciary responsibility or a fiduciary relationship.

That instantly changes other things. It instantly makes clear why Facebook could never be an advertising-based business model. Imagine a psychotherapist who knew every detail of your life, and also listened to every one of your conversations, and everyone’s conversations with each other, and his entire business model was to sell access to that information to someone else. Or a priest in a confessional whose entire business model, the only way he makes money, is selling access to everything he’s learned in that confessional to someone else. When you frame it that way, it’s clear that this is a dangerous business model.

When I was a kid I was a magician. Throughout my background I’ve always had this sensitivity to the fact that people’s minds can be manipulated. Instead of seeing choices as an authoritative thing, I see human beings as living inside 24/7 magic tricks built out of the cognitive biases in their minds. And then in college at Stanford I was part of something called the Persuasive Technology Lab, which teaches engineering students how to manipulate people’s psychology and get them to engage with products. Some of my friends in those classes were the founders of Instagram.

They taught that if you want people to use products, you turn them into slot machines. You give them the juicy rewards. You give them out sometimes and hold them back other times. You make it addictive. The human animal is very easily manipulated. From that perspective, you have 2 billion people in one environment with a business model that manipulates all their deepest vulnerabilities. Including, and I think this is one thing we don’t talk about enough—on top of all this you add AI.

If you check out an article on the Intercept, there’s something called FBLearner Flow, where Facebook can predict what you’re going to be vulnerable to in the future. They can predict when you’ll have low self-esteem. They can predict when you’re about to change your opinions on certain topics. That psychotherapist isn’t just listening to you and your conversations, but to 2 billion people’s conversations. We’ve never had an AI that could learn from 2 billion minds, including what color of buttons light them up.

When you think about it that way, this is a dangerous situation, to have all that power completely unaccountable to the public interest, to democracy, to truth. They can claim that they care about users, but they only care about them insofar as they need them to be jacked into this environment they’ve created.
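The “slot machine” design Harris describes is, in behavioral terms, an intermittent variable-reward schedule: a refresh sometimes pays out and sometimes does not, on an unpredictable cadence. Here is a toy sketch of that pattern, with made-up names and probabilities rather than any real product’s code.

```python
# Toy illustration of the variable-reward ("slot machine") pattern: pulling to
# refresh sometimes yields new items and sometimes yields nothing, which is
# what makes checking compulsive. Probabilities and names are hypothetical.
import random

def refresh_feed(payout_probability=0.3, max_items=5):
    """Return a random batch of new items, or nothing, like pulling a lever."""
    if random.random() < payout_probability:
        return [f"new_post_{i}" for i in range(random.randint(1, max_items))]
    return []

if __name__ == "__main__":
    random.seed(0)
    for pull in range(10):
        items = refresh_feed()
        if items:
            print(f"refresh {pull}: reward, {len(items)} new items")
        else:
            print(f"refresh {pull}: nothing new")
```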

Bay: Now that you’ve terrified us, let me turn to Chris. I’ve noticed you smiling, and I couldn’t tell whether that was appreciation or disagreement, so feel free to share. Also, you come at this from a different contractual perspective, in the contract between our data and companies that have access to it.

Hughes: Well, first, I agree with everything Tristan just said. His analysis—I give him credit for seeing this world and the direction it’s been heading in well before just about anyone else. I was nodding in general agreement.

The key thing that stands out to me, that you can’t overstate, is how much the design choices matter and encourage people into behaviors that may feel good in the short term, but are often an illusion for the long term. When it comes to data and ownership, this is what I’m thinking about the most these days, because this is where it’s not just a Facebook or a Google problem.

Your phones know where you’ve been geographically at every moment of the day. If you have one of those Nest thermostats that helps you be energy-efficient, it knows the temperature in your home. Your Alexa listens to everything you say, not to mention all your email. The amount of data we create is enormous, and in many cases that’s very good. Big data analytics—the analysis of Tesla driving patterns means future cars might be safer. But the issue is, all these people, all of us, create all this data, and we just hit “Agree.” We give up all legal rights. We get no compensation for that.

What’s happening now is historic profit. The margins of Facebook and Google are through the roof. The CEOs just say, “Well, you’re using our service for free.” In reality, all of that data is not only valuable now, but with the coming of artificial intelligence, it’ll be even more valuable in the future. That’s why more people should be talking about some kind of data dividend, some kind of sovereign wealth fund that’s capitalized from companies that make enormous historic profits off consumer data. Maybe it’s a five percent royalty on the revenues that go into that fund. It cuts a check to each American to make sure everyone shares in the upside of this a bit.

There’s precedent for this up in Alaska with oil. Every Alaskan gets a check paid for by something that was their common resource. Data is the common wealth of the future century that we’re all creating. We should tie our economic outcomes to it, so that it’s not just a few people who are getting extremely lucky, like myself and many others in the tech world.
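As a back-of-the-envelope illustration of the dividend Hughes floats, the sketch below pools a royalty on company revenues into a fund and divides it per person. The company revenues, the population figure, and everything beyond the five percent royalty he mentions are placeholder assumptions, not numbers from the panel.

```python
# Back-of-the-envelope sketch of a "data dividend": a royalty on the revenue of
# companies monetizing consumer data, pooled into a fund and paid out per person.
# All revenue figures and the population below are rough placeholders.
ROYALTY_RATE = 0.05          # "maybe it's a five percent royalty on the revenues"
US_POPULATION = 325_000_000  # rough late-2010s U.S. population

# Hypothetical annual revenues (USD) of companies paying into the fund.
company_revenues = {
    "social_network_a": 40e9,
    "search_company_b": 110e9,
    "ecommerce_company_c": 180e9,
}

fund = sum(revenue * ROYALTY_RATE for revenue in company_revenues.values())
per_person_check = fund / US_POPULATION
print(f"fund: ${fund / 1e9:.1f}B, annual check per person: ${per_person_check:.2f}")
# With only one company's revenue in the pool, the annual check shrinks to a few
# dollars per person, which is why a broader base of companies would be needed
# for the dividend to be meaningful.
```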

Above: John Steinberg of Cheddar at the Milken Global Conference.

Image Credit: Dean Takahashi

Malik: The way I think about the problem we have right now, it’s very much like the tobacco problem. Tobacco has been around forever, but it wasn’t as addictive until Philip Morris and others modified it and everything changed. Behavior modification in media has always been around, but Facebook took it to the next level.

In the tobacco industry, just putting that label on boxes was a big step forward. We need to have similar approaches to data and privacy. Instead of terms of service, companies big and small should be forced to write terms of trust. What will they not do with our data? Rather than saying what they will do with our data. That’s not particularly definable in technology terms. But when you look at what Facebook is paid to do, it should be to protect our data from leaking through some third-party app. That’s their job as a platform. There need to be terms of trust that lay that out.

The other thing we need to do is figure out a non-industry group that regulates data. I don’t think people from the technology industry should be allowed to do this. We need the people and their representatives coming up with rules and regulations around how data needs to be protected.

Bay: Give me three things you want to see in those terms of trust.

Malik: Number one is that the user’s data is not leaked to a third party, that it’s protected. How about that for a starter? My data isn’t going to get stolen. Number two, it won’t be sold to third parties. Facebook is giving me value and they expect to make money in return, but they can’t make money selling me to analyzers, to the likes of Cambridge Analytica.

Having been in technology for most of two decades, there was a time when Microsoft looked unbeatable. Sun Microsystems looked unbeatable. They all come and they all go. We’re in the fifth generation now. I’m assuming there is something else out there.

Bay: It’s interesting that big bad old media suddenly looks valorous in all of this. But in all seriousness, John, are there codes of ethics, codes of conduct from the old media business that are applicable here?

Steinberg: I’m glad you asked that question, because what I wanted to comment on is where I believe the social contract exists between media and the distributors of media. Imagine how bad the relationship has gotten between Facebook and the media, that somebody like Andy Lack, who’s hardly a bomb-thrower, called Facebook “Fakebook” in public. The relationship has gotten to the point that it’s worthless, if he’s willing to say something like that. My old CEO Jonah Peretti is now making comments about Facebook not living up to its bargain and compensating publishers. Rupert Murdoch, I guess he says whatever he wants to say, but he’s doing full-page letters now saying Facebook has to pay publishers money.

Facebook is a lot like Donald Trump in that the first time Trump does something crazy, everyone is shocked. Then, over time, it becomes normalized. Human beings can adapt to basically anything. If Comcast or Charter continually said to media companies, “Today you’re here, but tomorrow your channel disappears, sorry about that,” constantly moved people around the dial, some days you get access to audience data and other days you don’t—people would be up in arms. It would be intolerable. But what Facebook did is, they changed things up so often and were so routinely untrustworthy that people just became accustomed to it. There was nothing you could say anymore.

The only bright side to it is, you see someone like Tim Cook on MSNBC saying, “We never would have gotten ourselves into a situation like this.” Historically, Apple has always treated media companies far better. There’s been a set of rules. You signed a contract. You got on the system. The App Store didn’t change radically overnight and make your app suddenly disappear.

The traditional cable companies have typically behaved with much more negotiation and trust and agreement. When I look at our relationship now with Hulu and YouTube and Sling, we did a contract. We negotiated. We’re on the system. There are rules around what we put on the system. There are rules around how they compensate us and how the ad split works. There’s not only a social contract, but an actual contract in place.

Facebook has no friends left now. No one is rooting for their success. Every media company wants them to fail. That’s five, 10 years of bad behavior.

Malik: I have no sympathy for the media at all. This is a mess of their own making. They’ve shot themselves in the foot. I was the first person who went to my editors and said, “We should start a website.” They said no, and then they fired me. Big media, let them rot in hell. That’s all. [laughter]

But the problem we’re talking about here is Facebook. How do we think about the next chapter of Facebook? That’s more important. What’s happened in the past isn’t going to change. How are we going to control the beast?

Bay: Let’s talk about that. Where do we go from here? Is regulation in the U.S. inevitable?

Above: Willow Bay of USC and Om Malik of True Ventures.

Image Credit: Dean Takahashi

Steinberg: I don’t think anything is inevitable, politically, in this climate.

Harris: There are different things happening. The Honest Ads bill may pass. But obviously everyone knows that the U.S. regulatory climate is not very functional at the moment. The GDPR is about to go into effect in Europe, if you weren’t aware of that. That’s coming up in May. That’s going to set data rules. But now the U.S. is comparatively unprotected in terms of privacy. That’s one thing to consider. I know Klobuchar and Kennedy and Blumenthal are putting up a bill that’s essentially universal data protection for U.S. citizens, hopefully mapping to the protection in Europe.

We need to examine much deeper questions, though, around what it means to have something so powerful. How do you make it accountable to something other than its own profit? Only because Facebook is affecting 2 billion people’s minds—we haven’t even gotten to the issues around, it’s a machine designed to throw thoughts into people’s minds based on whatever thoughts got clicked and liked the most. In languages the engineers don’t even speak – in Sri Lanka, in Burma – you have genocides being amplified by the fact that these are countries that came online only in the last two or three years. You have automated systems pushing ideas in people’s minds that literally cause them to kill each other. The U.N. has called out Facebook, in the case of Burma, as one of the principal amplifiers of that conflict.

I say this in a sense of, when the New York Times asks nine experts, “What would you do to fix Facebook”—Tim Wu, who wrote a great book called The Attention Merchants about this, said, “I would turn it into a global public benefit corporation.” It might sound incredibly naïve to say something like that, but I don’t think anything with so much power should be anything but that. There’s a question of whether you take the existing system there, or you constrain the existing system and make way for new competition. Both of those are important to consider.

Malik: I think it’s a naïve way of thinking about the future, the idea of making it a non-profit, a global benefit corporation. Where we need to be thinking is not about the past, about what they’ve been able to build, but putting more regulation around facial recognition, visual data they’re collecting, video data they’re collecting. AI can create much more effective fake personas. That can have much more damaging impact on society. That’s the baseline we need to start with. What’s happened in the past is probably difficult to monitor right now. We should have very hard rules established around collecting visual data.

Bay: Hard rules established by who? Do we use the same levers we’ve traditionally used – government regulation, industry self-policing, consumers voting with their wallets, advertisers voting with their wallets? All of the above.

Malik: Yes, plus whatever Chris suggests.

Hughes: Plus the data dividend?

Bay: Yeah, a data protection agency, or a data dividend.

Hughes: Well, I think both. I think we can have a data protection agency modeled on the Consumer Financial Protection Bureau, or if you don’t like that because you’re on the right, you can choose other regulatory agencies. There is, even in a period when there’s a lot of distrust of government—unfortunately, we have to have public policy. That’s the role of public policy, to stand up for citizens who need that protection.

We live in a time where, whenever you start talking about public policy, people just tune out. “We’ll never get anything done.” I share that cynicism, but we have to beat back against it. This is a perfect example of a place where we can make headway.

I wanted to comment on the three things you just outlined. Consumers voting with their wallets is not feasible in this moment, in the same way it might be in others. When you think about competition among these platforms, Facebook owns Messenger, Instagram, and WhatsApp. By some counts that’s 80 percent of the social traffic on the web, all going through Facebook services. This idea of “delete Facebook” as a movement—there aren’t viable alternatives out there.

I can’t leave Gmail, and I don’t think it’s right to ask me to leave, with all the services that are locked in there. By the way, Google knows a lot more about me than Facebook does. We need to have a more nuanced view of what consumer power is here. Some people are calling for data agents, which would enable people to choose an agent, almost like a union, to lobby on people’s behalf. It may seem like a naïve idea, but there are some very smart people thinking ahead about how this might work in companies that are starting up. I don’t know what direction this is going to take, but we have to change the power dynamics in this landscape.

Steinberg: Why is it that all the smart, decent people used to work at Facebook, and are now on the outside? I mean this with all respect. You, Sean Parker, Roger McNamee, the 20 other people on the outside—doesn’t Mark say, “Hey, all the good people with good ideas and decency left. Sheryl’s still in hiding. What’s going on?”

Hughes: This is where we might disagree a little bit. I don’t think the leadership there is motivated by any malice. I have not seen the side of them that’s as brusque or—there are certainly lots of pockets in the company of people who have very different views than I do, so I’m not setting up to defend them. I do think, though, that it’s too easy to say that they’re all living in a different world. They’re reading the same news and having the same conversations we’re having. But the public pressure is only now just beginning. A year ago Mark Zuckerberg was running for president. There was certainly not this tone of close criticism of the company.

Again, not just at Facebook, but across the board at these companies, they’re very aware they’re under the microscope now. I think that’s a good thing for them and for democracy in the long term. They’ve been unaccountable for so long.

Malik: A bigger problem in Silicon Valley that no one wants to touch is that we all have blood on our hands. Every single person in the Valley has become beholden to an idea of unfettered endless growth at any cost. It’s not that people are bad. The whole incentive structure is based on growing fast and making a lot of money.

Think about Uber, which was just an idea in 2008. A few years later it’s a $50 billion company. That kind of growth cannot happen without taking some shortcuts. YouTube became this massive platform by infringing on IP. It’s not just Facebook. If there is a cultural moment right now, it’s time for Silicon Valley to take a step back and say, “We’re chasing unfettered growth. But behind every data point there is a person.” As long as we can internalize that as an industry, we’ll make better decisions.

This has to be across the board. Not just at Facebook or at Google. Investors, journalists, entrepreneurs, everyone has to be asking these questions. Are we doing the right thing? How are we protecting people’s data and privacy? How are we protecting the future? Facebook is the most visible platform, but there are others. People don’t talk about that.

Above: (Left to right) Tristan Harris of Center for Humane Technology, Chris Hughes of Economic Security Project, and Willow Bay of the USC Annenberg School for Communication and Journalism.

Image Credit: Dean Takahashi

Bay: If there is a call to action here, if we’re going to take this as a serious moment of cultural reckoning, what is the call to action? For the business community, for the tech community, for consumers, for policymakers?

Harris: I will say that public pressure is working. A year ago, would you have believed that Mark Zuckerberg would be testifying before Congress and talking about regulating social media? We were nowhere near that conversation. Public pressure may seem naïve, but it’s having an impact. The Delete Uber campaign, as an example, that’s not going to change their revenue or meaningfully drop their user numbers, but it does change the culture, especially for the employees.

I know there are many employees—Jan Koum, the founder of WhatsApp, just left Facebook. Many more people are in that same position. Imagine we don’t have any anti-trust law, or any advertisers who are willing to pull out their money. The one thing this company is built on is people. If those people don’t feel good about the practices or the business model or the responses of the company, they’re going to leave. That’s what happened at Uber, what forced Travis to leave and what created that cultural change. There’s a model for this that’s working, and we’ll see more of it.

Malik: In Uber’s case, the middle management wanted change. At Facebook the middle management wants no change. Jan Koum left, but he was not a regular employee. He was on the board. He started a company that was sold to Facebook. He’s an exceptional case. Just one person publicly quit over Facebook’s policies. There’s virtually no one there that thinks what they’re doing is wrong. No one is leaving a cushy million-dollar-a-year job and a Club Med lifestyle over this.

Steinberg: It goes back to what Chris was saying. Chris has been involved in public service, and you can have apathy about public service, or you can get involved. Watching that hearing, they’re so old. They are so very old. There’s nothing wrong with being old, but we can’t have every member of Congress at 75 years of age and older. The world changes fast. You need diversity throughout all organizations and cultures, and part of diversity is age diversity.

Whichever senator is asking Mark Zuckerberg, “If you don’t charge subscription fees, how do you make money?” And he has to answer, without offending him, “Senator, we sell ads.” That senator was either too confused or couldn’t be bothered to read an article. We need regulation. Maybe we need to make bad regulations and fix them to make better regulations, but we need to try. We need to have a political climate in this country where good people can run for office.

Bay: You’re saying, good people who are tech-savvy, who understand these issues that are front and center.

Steinberg: The reason why I talk about age diversity is I think you could have put almost any 25-year-old in that room and they would have asked more informed questions. But not one person who asked questions is even on Facebook. That’s scary.

Audience: I love the idea of the data dividend. What would drive that to actually happen, if what John is saying is true, that Facebook has so much power?

Harris: In the long term it’ll have to be public policy. In the short term, you could imagine creating a cultural norm where companies agree to participate in a dividend. The numbers, though—there are a lot of Americans using the service, so the numbers don’t add up unless you go to not just Facebook and Google, but to a broader set of companies, and talk about a meaningful amount of money. Something like 3-5 percent royalties on revenue.

Again, the Alaska model shows that not only is it possible to do that (the royalties up there are several times larger than that), but it has very powerful effects. It’s that classic win-win that our society is so obsessed with. In that scenario, companies would be able to continue innovating. I think that would have to happen with that privacy and protection in tandem, while consumers get their cut.

The other thing that’s critical to mention—people talk about income inequality these days and the growing wealth of the one percent. So much of that is because returns on capital over time have been so pronounced. This is a way of giving everybody a share. It’s feasible to do, no less so than cutting checks from Social Security. It’s not complex. It’s just whether we can develop the political will to get it done.

Bay: Also, though, with that model, where it compensates us for our data, it doesn’t push us to rethink our relationship with this technology.

Harris: That’s why I think it has to happen in tandem. A sense of shared ownership of data is piece one, and with that will come better regulation. Also with that should come recognition of the wealth that’s created.

Hughes: There’s also some stuff happening at the state level in California. I work with a group called Common Sense Media, which does advocacy around how children are affected by these things, and there’s a bunch of stuff moving there. There are some big opportunities around privacy legislation. We have a lot of contacts with Congress members, and the staffers for major Congress members are emailing us and saying, “We see what you’re doing in California and that can serve as a model for us.” There’s actually a group called TechCongress which is educating Congress members about tech policy. They just don’t make it all the way to the top all the time.

I just want to give you guys hope that stuff is moving. There’s a bill in California, the bots bill, to address the fact that we have automation and AI pointed at people’s brains, whether it’s through deepfakes or addiction. It’s a serious issue when you have a supercomputer that knows more about how your mind works than you do. There’s some progress. If you prove in California that it’s possible, you can get some more momentum.


Above: Facebook CEO Mark Zuckerberg talks about the departure of cofounder Jan Koum at the F8 annual developer conference, held May 1-2, 2018, at the McEnery Convention Center in San Jose, California.

Image Credit: Khari Johnson / VentureBeat

Bay: What about the bots bill? Is that something you think we should get behind?

Hughes: It’s super early. I think it’s going to need fleshing out to get it right.

Audience: Part of the problem with this issue is obviously its global nature. You have 270 million Facebook users in India. You have 240 million Facebook users in the U.S. You have more Facebook users in Indonesia and Brazil combined than you do in the U.S. What’s their perspective on how we can tackle this issue and create a social contract in nations without the same democratic traditions as the U.S., but where there’s an even greater need for focus on this issue?

Harris: This is what keeps me up at night. I was on a panel with someone who’s head of policy for Facebook in Europe, talking about these issues of democracy and fake news. One by one, all these representatives from places like Sri Lanka and the Philippines, these countries where Facebook is creating cultural damage—they were kind of raising their hands meekly and saying, “Could you please take a look at the fake news in my country? Because it’s even more meaningful than it is over here.”

The problem is, how big of a budget of control are you going to need to deal with this issue? The fundamental thing is we’ve opened Pandora’s box. There are 2 billion people plugged in. Think about 2 billion live TVs. There’s a five-second delay on live TV for a reason. You have to be careful if someone says something. But now we have 2 billion live TVs. The amount of moderation or values or ethical discernment, all those processes and conversations you have to have, you can’t do that in all these different countries. Facebook probably doesn’t have engineers who even speak the language from many of these countries where it’s active.

What we need to start establishing is, what would that security budget and police force be? Five percent per country? Whatever enormous amount of staff they have to allocate in the short term. But in the long term it’s automation. These systems are so automated that it’s very hard to control.

Malik: A more important way to think about this is, how do you graft social and cultural genes into Facebook’s work? Any product or company like Facebook calls its customers “users.” That alone is the biggest problem. They should start by referring to them as citizens, more than anything else. Then they can take the step to having local bodies educating the mothership about how things work locally.

I remember being a reporter in India. Every time I read a story in the New York Times about India, I would think, “Are we living in the same country? How is this even possible?” That gap is more magnified on social media platforms. Viewing from afar, you can say, “These are the number of users we have in Brazil.” No, this is the number of people we have in Brazil. We have to think about them as people, not as data points. That cultural change needs to happen way before any rules come into play. That’s the most important thing they can do: establish what are almost embassies of learning and understanding local culture. That’s not a huge cost issue for a company.

Audience: The question I ask myself is, do I really need this? In the social engineering aspect, are we so socially engineered now that we are not going backward? We’re not going to a place where I get a flip phone and don’t text anybody?

Steinberg: I think that it’s entirely possible that we’ll turn around five or 10 years from now and say that this was all a big stupid waste of time. Facebook was a big stupid waste of time. Instagram was a big stupid waste of time. We’re just doing different things. We have better software so we’re not inundated with stupid emails all day. We spend a lot less time doing this.

There was a great article written in the New York Times where somebody explained—I can’t remember if this was the 1920s or the 1930s, but heroin dens were unbelievably widespread in New York City. An enormous number of people would go and do heroin all night long, basically. It’s possible we’ll look back on this as just a terrible time in human history, those days when we all spent tons of time staring at our phones.

Harris: I think that’s unlikely in the short term. But the history of technologies is—you start with the printed advertisement, and Tim Wu’s book traces that through radio and television. More and more time and more and more attention is owned. But we don’t see everyone stop using television. People still listen to the radio. Social media just gets added on.

I get more worried, then, that this is just added on, and the next frontier – augmented reality, or whatever else you’re most interested in – just gets added on too. I’m skeptical. Personally, I feel very addicted to my device. I go out of my way to try not to check email or Slack every hour, or on the weekends just put the phone away. If I’m in a room without a phone for more than an hour, I’m anxious, and I know why. I’m the type of person who, I like to think, is pretty aware of all of these things, aware of the behavioral effects, and yet I still suffer from it. I don’t know what to do for myself, let alone my kid.

Above: (Left to right) Willow Bay of the USC Annenberg School for Communication and Journalism, Om Malik of True Ventures, and John Steinberg of Cheddar.

Image Credit: Dean Takahashi

Steinberg: It’s cigarettes. I’ve used the analogy something like five times. Everything you’ve said there sounds like someone who needs a cigarette. I feel the same thing. A whole other panel is on children. For those of us that have young children, this is terrifying.

Bay: Wait, though. Millions of Americans have quit their nicotine habit. There must be a path. You don’t need to sound as defeatist as I’m reading it, do you?

Steinberg: I think it’s different than nicotine, though. It is often socially useful. This is the slot machine metaphor. Sometimes I really need to read my email.

Bay: But there’s a path between cold turkey and—why isn’t there a path toward a saner existence around social media?

Harris: The simplest thing, by the way, is turning off all the notifications on your phone, so it’s not always thudding against your skin. The second thing is setting your phone to grayscale. Just the colors on your screen can activate that banana-like reward for the chimp inside you. If you turn that off, it actually makes a really big difference.

We started the Center for Humane Technology with the premise that you have to redesign the entire technology stack from the ground up with a very vulnerable and manipulatable model of a human being. Take a magician’s view of the human mind. How can we be manipulated at every single level by a phone? It’s not just that it’s addictive in a generic way. It’s that the addictive stuff is right next to the stuff I have to do. I have to know if I’m late to a meeting and check my email, but that’s right next to Facebook, and I’m getting a ding. I’m getting that social approval.

You have to think of this like a badly designed urban plan. You wake up one day, and after 20 years of no zoning and no building codes, suddenly you’re in this casino environment. Everything’s insane. Nobody believes anything is true. Everyone’s addicted to slot machines. That’s the city you enter when you look at your phone. The answer isn’t to throw away the city. The answer is to have Jane Jacobs or someone like that ask, what makes a city livable for people?

You can ask that question in a humane way and not just design it, but also have business models that are non-extractive, not based on manipulation, and also humane policy. But you need to think about it from the ground up.
