In case you missed it, Facebook announced yesterday that it was unpublishing four pages belonging to fringe conspiracy site Infowars and its main host, Alex Jones, for violating the social network’s hate speech guidelines.

That was just a few hours after Apple announced it was yanking Jones’ podcast for violating hate speech and Spotify removed all episodes of the Alex Jones Show. Then YouTube unpublished Infowars’ channel. Then Pinterest (yes, Infowars had a Pinterest board, for some reason) announced it was banning Infowars, too.

Despite all of these bans happening within 24 hours of one another, no company acknowledged that the others’ actions had any effect on its own. Rather, most gave a blanket statement to the effect that Infowars and Jones violated their platform’s rules against hate speech.

Only YouTube and Facebook went into any detail — Facebook sent out an 846-word press release, stressing how it enforces its community standards and explaining that the four aforementioned pages had accumulated too many strikes for publishing content that glorified violence and had used “dehumanizing language to describe people who are transgender, Muslims, and immigrants.”

YouTube also said it deleted Jones’ channel for repeatedly violating its hate speech policies, and for trying to circumvent a previous ban YouTube had placed on Jones — he was banned from livestreaming for 90 days in July but continued to livestream on other accounts.

In any case, the ban is long overdue, as Jones and Infowars have for years published videos that contain hate speech or threats of violence, while also pushing baseless conspiracy theories, like one proposing that the victims of a mass shooting at Sandy Hook elementary school were actually child actors. Jones — who supported President Donald Trump during the 2016 election — also issued threats of violence against President Trump’s critics during broadcasts, only to later claim it was a “performance” and part of his effort to “destroy [critics] with the truth.”

But for those hoping that the Infowars ban means tech companies have finally seen the light and will start to crack down on hoaxes and hate speech, yesterday’s incident provides little comfort and offers few lessons on how to do so at scale.

To get Infowars banned took a depressingly creative array of community standards violations. YouTube was the first to issue a strike against Jones and Infowars in February, thanks to a relatively new update to a harassment policy that applies to people who target victims of mass acts of violence by claiming the incidents are hoaxes. The strike in question was applied to a video putting forth an unfounded suggestion that a survivor of a mass shooting at Marjory Stoneman Douglas High School was a “crisis actor.”

On the one hand, the strike was a much-needed signal that tech companies were willing to take a stand against conspiracy theorists. On the other hand, YouTube channels need to accumulate three strikes in three months in order to get banned, so come May Infowars was just as far from getting banned as it was in the beginning of February.

Two months later, the topic of banning Infowars regained prominence, as CNN asked Facebook during a Q&A why — if it was so intent on fighting fake news — it still allowed Infowars to operate on its website. The company tripped over itself in the following days as it tried to justify its position. First it said it wanted “different people” to be able to have “different points of views;” then it sought refuge by claiming that both the Right and the Left publish fake news, making Infowars unexceptional.

The peak of Facebook’s humiliation came during a podcast interview with Recode’s Kara Swisher, when CEO Mark Zuckerberg, in trying to defend Facebook’s decision to keep the Infowars page up, brought up people who deny the Holocaust as examples of those who should still have a voice on Facebook, because “I don’t think that they’re intentionally getting it wrong.”

Online activist group Sleeping Giants turned up the heat as well, repeatedly tweeting at Spotify, Stitcher, Apple, Facebook, and YouTube to ask how certain pieces of Infowars’ content could be said to not violate their terms of service.

Would Facebook have announced yesterday that it had banned Infowars’ Pages if Stitcher and Apple hadn’t yanked Infowars’ podcasts just hours earlier? Would we even be talking about Infowars being banned if YouTube hadn’t issued a strike against the channel in February? It’s unlikely Infowars would have drawn the ire of so many online activists if it hadn’t had such a wide reach — into the millions of subscribers — and cranked out videos, podcasts, and Facebook posts that all contained multiple violations of hate speech policies.

There are, however, two lessons I hope tech companies will take from this. First, they shouldn’t be afraid to ban prominent, repeated violators of their community standards.

Aside from a few politicians and commentators wringing their hands about what this means for free speech on Facebook, there have been no widespread calls to investigate Facebook and no protests outside company headquarters. Infowars has been left with few defenders, because, well, claiming that child victims of mass shootings are faking it will do that to you.

Second, the best way to prevent the next Infowars from bubbling up to the surface is to find ways to limit the reach of Pages and organizations that repeatedly push hate speech and conspiracy theories. Facebook — which seems to have come off looking the worst in all this — might have avoided a whole month’s worth of PR headaches if it had, years ago, stopped allowing Infowars to advertise on its site or refrained from suggesting that users “like” Infowars’ Page.

Facebook is now trying to stop the spread of fake news by removing monetization and advertising abilities from Pages that receive too many “false” rankings from its third-party fact checkers, but that’s not enough. The company also needs to be more transparent about how effective its fact-checking efforts are when it comes to the spread of fake news. The company claimed that once its fact-checking partners decide to add a “disputed” label to a story promoting fake news, that story sees a decrease in likes and shares. But how many likes and shares does it lose on average? And who are some of the biggest repeat offenders when it comes to promoting “disputed” stories? These are questions Facebook hasn’t answered yet.

Most of all, tech companies need to do a better job of enforcing their existing terms of service. As Josh Koskoff, a lawyer representing parents of Sandy Hook victims noted yesterday in response to the ban, “Unfortunately, for many of the Sandy Hook families, the damage has already been done.”