Steinberg: The reason why I talk about age diversity is I think you could have put almost any 25-year-old in that room and they would have asked more informed questions. But not one person who asked questions is even on Facebook. That’s scary.

Audience: I love the idea of the data dividend. What would drive that to actually happen, if what John is saying is true, that Facebook has so much power?

Harris: In the long term it’ll have to be public policy. In the short term, you could imagine creating a cultural norm where companies agree to participate in a dividend. The numbers, though: there are a lot of Americans using the service, so the numbers don’t add up unless you go not just to Facebook and Google, but to a broader set of companies, and talk about a meaningful amount of money. Something like 3-5 percent royalties on revenue.

Again, the Alaska model shows that not only is it possible to do that (the royalties up there are several times larger), but it has very powerful effects. It’s that classic win-win that our society is so obsessed with. In that scenario, companies would be able to continue innovating. I think that would have to happen in tandem with privacy protections, while consumers get their cut.

The other thing that’s critical to mention—people talk about income inequality these days and the growing wealth of the one percent. So much of that is because returns on capital over time have been so pronounced. This is a way of giving everybody a share. It’s feasible to do, no less so than cutting checks from Social Security. It’s not complex. It’s just whether we can develop the political will to get it done.

Bay: Also, though, that model, where it compensates us for our data, doesn’t push us to rethink our relationship with this technology.

Harris: That’s why I think it has to happen in tandem. A sense of shared ownership of data is piece one, and with that will come better regulation. Also with that should come recognition of the wealth that’s created.

Hughes: There’s also some stuff happening at the state level in California. I work with a group called Common Sense Media, which does advocacy around how children are affected by these things, and there’s a bunch of stuff moving there. There are some big opportunities around privacy legislation. We have a lot of contacts with Congress members, and the staffers for major Congress members are emailing us and saying, “We see what you’re doing in California and that can serve as a model for us.” There’s actually a group called TechCongress, which is educating Congress members about tech policy. It just doesn’t make it all the way to the top all the time.

I just want to give you guys hope that stuff is moving. There’s a bill in California, the bots bill, to address the fact that we have automation and AI pointed at people’s brains, whether it’s through deepfakes or addiction. It’s a serious issue when you have a supercomputer that knows more about how your mind works than you do. There’s some progress. If you prove in California that it’s possible, you can get some more momentum.

Above: Facebook CEO Mark Zuckerberg talks about the departure of cofounder Jan Koum at F8, the annual developer conference held May 1-2, 2018 at the McEnery Convention Center in San Jose, California.

Image Credit: Khari Johnson / VentureBeat

Bay: What about the bots bill? Is that something you think we should get behind?

Hughes: It’s super early. I think it’s going to need fleshing out to get it right.

Audience: Part of the problem with this issue is obviously its global nature. You have 270 million Facebook users in India. You have 240 million Facebook users in the U.S. You have more Facebook users in Indonesia and Brazil combined than you do in the U.S. What’s your perspective on how we can tackle this issue and create a social contract in nations without the same democratic traditions as the U.S., but where there’s an even greater need for focus on this issue?

Harris: This is what keeps me up at night. I was on a panel with someone who’s head of policy for Facebook in Europe, talking about these issues of democracy and fake news. One by one, all these representatives from places like Sri Lanka and the Philippines, these countries where Facebook is creating cultural damage—they were kind of raising their hands meekly and saying, “Could you please take a look at the fake news in my country? Because it’s even more meaningful than it is over here.”

The problem is, how big of a budget of control are you going to need to deal with this issue? The fundamental thing is we’ve opened Pandora’s box. There are 2 billion people plugged in. Think about 2 billion live TVs. There’s a five-second delay on live TV for a reason. You have to be careful if someone says something. But now we have 2 billion live TVs. The amount of moderation or values or ethical discernment, all those processes and conversations you have to have, you can’t do that in all these different countries. Facebook probably doesn’t have engineers who even speak the languages of many of the countries where it’s active.

What we need to start establishing is, what would that security budget and police force be? Five percent per country? Whatever enormous amount of staff they have to allocate in the short term. But in the long term it’s automation. These systems are so automated that they’re very hard to control.

Malik: A more important way to think about this is, how do you graft social and cultural genes into Facebook’s work? Any product or company like Facebook calls its customers “users.” That alone is the biggest problem. They should start by referring to them as citizens, more than anything else. Then they can take the step to having local bodies educating the mothership about how things work locally.

I remember being a reporter in India. Every time I read a story in the New York Times about India, I would think, “Are we living in the same country? How is this even possible?” That gap is more magnified on social media platforms. Viewing from afar, you can say, “These are the number of users we have in Brazil.” No, this is the number of people we have in Brazil. We have to think about them as people, not as data points. That cultural change needs to happen way before any rules come into play. That’s the most important thing they can do: establish what are almost embassies of learning and understanding local culture. That’s not a huge cost issue for a company.

Audience: The question I ask myself is, do I really need this? In the social engineering aspect, are we so socially engineered now that we are not going backward? We’re not going to a place where I get a flip phone and don’t text anybody?

Steinberg: I think that it’s entirely possible that we’ll turn around five or 10 years from now and say that this was all a big stupid waste of time. Facebook was a big stupid waste of time. Instagram was a big stupid waste of time. We’re just doing different things. We have better software so we’re not inundated with stupid emails all day. We spend a lot less time doing this.

There was a great article written in the New York Times where somebody explained—I can’t remember if this was the 1920s or the 1930s, but heroin dens were unbelievably widespread in New York City. An enormous number of people would go and do heroin all night long, basically. It’s possible we’ll look back on this as just a terrible time in human history, those days when we all spent tons of time staring at our phones.

Harris: I think that’s unlikely in the short term. But the history of technologies is—you start with the printed advertisement, and Tim Wu’s book traces that through radio and television. More and more time and more and more attention is owned. But we don’t see everyone stop using television. People still listen to the radio. Social media just gets added on.

I get more worried, then, that this is just added on, and the next frontier – augmented reality, or whatever else you’re most interested in – just gets added on too. I’m skeptical. Personally, I feel very addicted to my device. I go out of my way to try not to check email or Slack every hour, or on the weekends just put the phone away. If I’m in a room without a phone for more than an hour, I’m anxious, and I know why. I’m the type of person who, I like to think, is pretty aware of all of these things, aware of the behavioral effects, and yet I still suffer from it. I don’t know what to do for myself, let alone my kid.

Above: (Left to right) Willow Bay of the USC Annenberg School for Communication and Journalism, Om Malik of True Ventures, and John Steinberg of Cheddar.

Image Credit: Dean Takahashi

Steinberg: It’s cigarettes. I’ve used the analogy something like five times. Everything you’ve said there sounds like someone who needs a cigarette. I feel the same thing. A whole other panel is on children. For those of us that have young children, this is terrifying.

Bay: Wait, though. Millions of Americans have quit their nicotine habit. There must be a path. You don’t need to be as defeatist as you sound, do you?

Steinberg: I think it’s different than nicotine, though. It is often socially useful. This is the slot machine metaphor. Sometimes I really need to read my email.

Bay: But there’s a path between cold turkey and—why isn’t there a path toward a saner existence around social media?

Harris: The simplest thing, by the way, is turning off all the notifications on your phone, so it’s not always thudding against your skin. The second thing is setting your phone to grayscale. Just the colors on your screen can activate that banana-like reward for the chimp inside you. If you turn that off, it actually makes a really big difference.

We started the Center for Humane Technology with the premise that you have to redesign the entire technology stack from the ground up with a very vulnerable and manipulatable model of a human being. Take a magician’s view of the human mind. How can we be manipulated at every single level by a phone? It’s not just that it’s addictive in a generic way. It’s that the addictive stuff is right next to the stuff I have to do. I have to know if I’m late to a meeting and check my email, but that’s right next to Facebook, and I’m getting a ding. I’m getting that social approval.

You have to think of this like a badly designed urban plan. You wake up one day, and after 20 years of no zoning and no building codes, suddenly you’re in this casino environment. Everything’s insane. Nobody believes anything is true. Everyone’s addicted to slot machines. That’s the city you enter when you look at your phone. The answer isn’t to throw away the city. The answer is to have Jane Jacobs or someone like that ask, what makes a city livable for people?

You can ask that question in a humane way and not just design it, but also have business models that are non-extractive, not based on manipulation, and also humane policy. But you need to think about it from the ground up.
