If you’re a U.S.-based Facebook user with somewhat politically active friends, your News Feed likely became a breeding ground for conspiracy theories and hyperpartisan memes in the months leading up to the 2016 elections. Among the users who became concerned about what they saw in their feed at this time was early Facebook investor Roger McNamee. A cofounder of investment firm Elevation Partners, McNamee was also an early mentor to CEO Mark Zuckerberg and suggested he hire Sheryl Sandberg as COO.

McNamee said he saw an increasing array of messages opposing Democratic presidential nominee Hillary Clinton inserted into his News Feed, spread by groups supposedly for Bernie Sanders, Clinton’s rival in the primaries. As he writes in a new book out today called Zucked: Waking Up to the Facebook Catastrophe, “the rapid spread of images from these Sanders-associated pages did not appear to be organic. How did the pages find my friends? How did my friends find the pages? Groups on Facebook do not emerge full-grown overnight. I hypothesized that somebody had to be spending money on advertising to get the people I knew to join the Facebook Groups that were spreading the images. Who would do that?”

That was strike one in a series of events that prompted McNamee to write an email to Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg in October 2016, in which he stated that “Facebook is enabling people to do harm. It has the power to stop the harm. What it currently lacks is an incentive to do so.” With that email, McNamee embarked on a two-year effort to speak out about the dark side of Facebook and social media.

VentureBeat obtained an advance copy of the book, in which McNamee shares how he grew disillusioned with Facebook and talks about the people he connected with over the past two years as he sought to amplify his message and understand the dangers of social media more clearly himself. Most of the ancillary players in Zucked will be familiar to people within the tech industry: Tristan Harris, with whom McNamee joined forces to launch the Center for Humane Technology; Senator Mark Warner (D-VA), who has played a leading role in arranging congressional hearings with Zuckerberg and other tech executives over the past year and a half; and billionaire philanthropist George Soros, who in 2018 gave a speech at Davos warning about the dangers of social media.

For those who’ve been following Facebook’s attempts over the past few years to appease lawmakers and members of the public concerned about fake news, disinformation, and social media addiction, Zucked doesn’t cover much new ground. The book is also short on insider information about Zuckerberg and Facebook executives — as McNamee readily admits, before he emailed Zuckerberg in 2016 he hadn’t been in contact with him for about seven years.

Rather, McNamee told VentureBeat in an interview that the book is geared toward a broader audience, including parents of teenage children, who might not fully understand how products like Facebook have been designed to make divisive messages spread like wildfire and to encourage users to spend an unhealthy amount of time on the platform. The book is heavy on tech industry history, so it will also appeal to readers interested in understanding the forces that have shaped companies like Facebook.

In a statement to VentureBeat, a Facebook spokesperson said: “We take criticism seriously. Over the past two years, we’ve fundamentally changed how we operate to better protect the safety and security of people using Facebook. The reality is Roger McNamee hasn’t been involved with Facebook for a decade.”

VentureBeat spoke with McNamee before the publication date about his hopes for the book and how prepared he thinks Facebook, Twitter, and YouTube are to safeguard future elections.

Highlights from the interview, with answers edited for clarity and length, are below.

Why he decided to email Zuckerberg and Sandberg in 2016

McNamee: When I saw the things coming out of the groups associated with Bernie Sanders, it caught me off guard, because it never occurred to me that bad actors could possibly use Facebook’s architecture and tools to harm anybody.

Then, a month later, you had the story where Facebook had expelled this group that was scraping information on people, using the [advertising] APIs to gather data on people expressing interest in Black Lives Matter, and [the group] sold it to police departments. Now Facebook expelled the group, which was exactly the correct response, but I was thinking to myself that the harm had already been done.

Then you have Brexit, and Brexit’s the first time where I thought to myself, “Wow, is it possible that Facebook’s tools and advertising capabilities work better for incendiary messages than neutral ones?”

After that, we had the Department of Housing and Urban Development citing Facebook for advertising tools that violated the Fair Housing Act … it was the combination of all those things that made me reach out to Mark and Sheryl. But keep in mind, at that time I thought Facebook was the victim. And so I reached out hoping that I could be helpful. Then, when they weren’t interested in following through, I thought to myself “I really need to understand this better.”

What he discovered after meeting with Tristan Harris, a former design ethicist at Google and proponent of the ‘Time Well Spent’ movement

McNamee: It wasn’t until I met Tristan in April of 2017 that I understood the notion of brain hacking and the whole issue of how internet platforms can first create habits, and how those habits can evolve into addictions. I already knew that I was addicted [to Facebook]. What I hadn’t realized was that it was more than just my fault — I hadn’t realized that somebody was consciously trying to make that happen. They don’t think of it as addiction, they think of it as what they would call “engagement.”

[Another] problem here is that [social media companies] have accumulated massive political power, and they’ve left not just our country — there are many countries [affected] — vulnerable to outside influences. And they aren’t accountable to anyone. They didn’t get elected, and they pretend as though they’re not responsible for what happens.

How he thinks Facebook, Google, and Twitter did during the 2018 midterms, and how they’ll respond in future elections

McNamee: I think that 2018 represents a really important pivot point — less for what Facebook and Google did to prevent disinformation, [but more] for the evolutionary change in the people who use those products. A really interesting set of people — a minority still, but a pretty large minority of users — decided that they weren’t going to get their political news from social media anymore. And a surprising number of people chose to get directly involved in politics for the first time.

We’re still playing whack-a-mole. The platforms are still very vulnerable, and there could be new ways of using them [to spread disinformation]. If you look at Brazil, in the election that happened there last year, the big platform there was WhatsApp, and the same thing was going on in India.

I don’t know the thing we ought to be afraid of, but what I know is that the philosophy of Silicon Valley remains one of shipping products as rapidly as possible and then expecting the people who use those products to discover the bugs and effectively do all the policing for you. And I think in the context of elections, that’s not appropriate. I think you have to be more careful than that.

How he thinks Facebook, which just turned 15, will fare in the next 15 years

McNamee: The first thing I will say is Facebook is one of the most incredible success stories in the history of Silicon Valley. And the problems we’re dealing with here are problems that were created by a business model that produced fantastic earnings for the company, but also produced side effects with an unacceptable cost to society.

My hope is that the company will embrace the criticism and will address the side effects by changing its business model, to recognize that it succeeded beyond its wildest dreams and it’s time now to look at the bigger picture. I don’t know if they’re going to do that. They certainly haven’t shown any sign of doing that yet.

What I fear is that if the company’s unwilling to change itself, that change will have to be forced upon it by governments on the outside, whether it’s the European Union or the U.S. government or state governments, or some combination of them. And I can imagine a scenario where the natural evolution of tech, in combination with intervention by governments, makes the 15-year outlook for Facebook look pretty grim. I hope that’s not what happens. My hope is that they have a Susan Fowler moment, the way that Uber did, and move aggressively to address the flaws in the business model.