In a fit of frustration last week, following the U.S. presidential election results, I shared a news story to my Facebook feed about an incident in Michigan involving school kids chanting “Build that wall!” at their Latino classmates.

The story originated at the Detroit News, where reporters saw a video of the incident on Facebook and then spoke to witnesses and got statements from school officials confirming what had happened. In response to my post, a friend posted a video from a pro-Trump propaganda site called Infowars, with the headline “Shock video: Black mob viciously beats Trump supporter.” He wondered, sarcastically, whether president-elect Donald Trump should also be blamed for this.

The video shows some African-Americans pulling a white man from a car at an intersection, and a woman can be heard (but not seen) yelling, “You voted Trump!” Some friends, on both the Right and the Left, criticized my initial suspicion that the video was fake, saying the video couldn’t be denied because, well, it was video proof.

Why was I doubtful? Infowars has a classic style of taking quotes or video and wrapping them in misleading context to rouse conservative followers. As it turns out, Snopes had already debunked the video as being about a traffic altercation that had nothing to do with Trump voters.

I was thinking about this exchange as I read Facebook founder Mark Zuckerberg’s defense of the platform against charges that it plays a role in disseminating fake news. He says less than 1 percent of what is shared on Facebook is “fake news.” So how, precisely, does Facebook, when measuring that statistic, differentiate between the two examples above?

Lots of stuff from Infowars and conservative website Breitbart gets shared across Facebook. Most people on the left would argue that almost all of it is fake or distorted in some fashion. We could be wrong. But how, precisely, would Facebook know? And if it did find out, at some point, how would it correct or stop the spread of such stories?

To be clear, this isn’t just about Facebook. It’s 4chan, Twitter, Reddit. Hell, it’s the internet. The whole damn web, really. It’s not just that teenagers in Macedonia can now make big money generating fake news; it’s that social media accelerates diffusion, and search via Google allows you instantly to find some “news” story that supports your world view or interpretation of an event.

Last week, as the election unfolded, I was in Lisbon along with 53,000 other attendees for the Web Summit. This is typically a gathering of techno-optimists who believe, or want to believe, that what they are doing is making the world a better place. But even before the U.S. election results came in, there was already a lot of discussion about the growing role the internet is playing as a source of disinformation and divisiveness.

One particularly compelling bit of analysis came on Tuesday, just as U.S. voters were about to head to the polls. The speaker was Ann Mettler, head of the European Political Strategy Centre (EPSC), the in-house think tank of the European Commission. A self-described former techno-optimist, Mettler has found her views of digital and social media changing since she became a bureaucrat.

“I now see also a somewhat darker side of the internet emerging,” she said.

She noted that this ugliness was manifesting itself across Europe, with the Brexit vote in the U.K. and the rise of Right-wing parties across the continent. But she found the insurgent Trump campaign particularly disturbing.

“I am personally convinced [that] were it not for social media, he would not have been the candidate for the Republicans,” she said. She pointed to a statement Trump made last year in which he boasted about the power social media gave him.

This ability to have unfiltered conversations with the public has transferred the balance of power away from traditional media. Of course, for years, techno-utopians have hailed this as a great democratizing force that would allow the people’s voice to be heard across the world. Now, it is taking a more ominous turn.

Technology wanted to enable the crowd. Instead, it has unleashed the mob.

“The role of media was to mediate,” Mettler said. “They would choose what was really newsworthy for us to read or consume. But we no longer have that. And I think it has profound ramifications.”

The idea of a journalist-mediated conversation is a quaint notion, and one that had its own flaws. But in many ways that doesn’t matter, because it belongs to an era we will never go back to.

As I was traveling back home from Lisbon on Friday, I saw just the latest manifestation of this new social media news era on Twitter when “GrubHub CEO” started trending. Supposedly, GrubHub CEO Matt Maloney had sent an email to employees saying that anyone who voted for Trump should resign. This sparked predictable outrage from both the Left and the Right.

Except, of course, he didn’t actually say that. What he did say is that the company did not share Trump’s values of hate and discrimination. And he said that anyone who did not agree that hate and discrimination were bad things should leave. It’s an important distinction that many Trump voters make (voting for Trump doesn’t make me a racist!), but that was, of course, lost in the social media cacophony.

Amazingly, the groups picking up this story included The Washington Post: “Grubhub CEO says employees who act like Trump ‘have no place here.’” It was a neat little twist, playing for virality while attempting to stay just inside the line of being factually truthful.

And again, this is where things get dicey if you’re going to try to parse “real” and “fake” news. It’s one thing to recognize an obviously silly fake news URL. But the Post itself, while doing some great investigative reporting this year on the candidates, has also mastered the art of the misleading clickbait headline to drive traffic.

Consider this array of headlines that appeared in just 24 hours on the Post’s website during the primaries:

[Image: screenshots of Washington Post headlines]

You can get a list of URLs for the stories here. But one thing almost all of them have in common is that the article bears little resemblance to the headline, or the headline writer made a leap in logic that walks right up to the edge of propaganda.

And it’s not just the Post. Mettler said that, in general, as media around the world gets disrupted, news organizations are learning the same lessons as those teenagers in Macedonia: Populists and demagogues drive pageviews and ratings. She noted that CNN received one of its highest ratings ever when it hosted a Republican primary debate — in large measure because Trump was involved.

“A populist will help drive business for media,” she said. “Which, in turn, gives them an incentive to give visibility to the populist. It’s not just about social media. It’s about the regular mainstream media.”

Mettler’s prognosis is not a particularly happy one.

“Democracy is about openness,” she said. “But if that openness is used to destroy the system, that’s a real wakeup call.”

The question now facing big Silicon Valley firms is what obligation they have to fix all of this. Zuckerberg’s comments, while acknowledging the issue, seem to be an attempt to shift any blame away from Facebook. Twitter has been trying to deal with hate and trolls for years, and it has largely failed.

Mettler would like to see more efforts made in this direction, and she hopes these companies realize that their brands and reputations could suffer over the long term if things continue to spiral out of control.

“We’re starting to realize that something potentially very detrimental is starting to happen,” she said. “The internet firms have a big incentive to be part of the solution.”

But nothing will likely happen without extreme pressure from consumers. And that’s problematic, because these same consumers are the ones reading and sharing fake news that conforms to their beliefs. A public that is willfully misinformed may not realize that it is a threat to democracy.

“I am worried that there is a new generation that maybe thinks about the rights we have in a democracy but not the responsibility,” Mettler said. “We have to be informed to a certain degree. Democracy, in and of itself, as we have seen, can be quite fragile.”

Here is Mettler’s full talk: