In January, Mark Zuckerberg decided that his resolution for 2018 would be to ‘fix’ Facebook, in the belief that “we’ll end 2018 on a much better trajectory.”
“The world feels anxious and divided, and Facebook has a lot of work to do — whether it’s protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent,” Zuckerberg wrote at the time. “We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.”
As the end of 2018 draws near, it’s safe to say Zuckerberg’s ‘much better trajectory’ has remained elusive. Facebook’s stock price has fallen from $181.42 per share at the beginning of this year to $134.52 at market close on Wednesday. It has managed to anger governments across the world due to negligence in protecting user data, poorly implemented ad products, and failure to stop bad actors in various markets from at best using sketchy growth tactics to get more clicks on their websites and at worst inciting genocide.
While not all of Facebook’s crises have been self-inflicted, some of the worst have been exacerbated by the company’s failure to quickly take responsibility for mistakes. Here are some of the most egregious examples over the past year.
Zuckerberg and Sandberg’s decision to keep quiet in the days following Cambridge Analytica
Facebook’s year got off to a bad start: by March, it was facing bombshell reports from the Guardian’s Observer and the New York Times about how the company had failed to stop data analytics firm Cambridge Analytica from improperly using the personal data of up to 87 million users for ad targeting. Cambridge Analytica, for anyone who missed it, had gotten the data from a researcher who created a personality quiz app called thisisyourdigitallife before Facebook restricted the amount of data app developers could gather, starting in 2014.
Before the Observer’s story was published, Facebook sent a letter to the publication threatening to sue, according to lead reporter Carole Cadwalladr. Then Facebook tried to get ahead of the story by announcing that it was suspending Cambridge Analytica, though the company’s first statement — issued by its general counsel, not Zuckerberg — tried to emphasize that things had changed by focusing on the fact that Facebook had already limited the amount of user information developers could access several years ago. The problem was that Facebook had let the genie out of the bottle by giving apps access to that data in the first place. Cambridge Analytica proved that bad actors could misuse such data for years to come, and Facebook needed to not just apologize for what it had done in the past but do a better job of monitoring developers in the present.
Zuckerberg didn’t speak with media outlets until four days after the reports broke, and Sandberg didn’t do interviews until nearly three weeks after, only adding to the feeling that the company had something to hide.
It’s hard to overstate just how much the Cambridge Analytica saga set the tone for the rest of Facebook’s year — the scandal even resulted in Zuckerberg testifying in front of Congress for the first time ever. If Facebook had been more honest in the days following the revelations about Cambridge Analytica and admitted it needed to do a better job vetting apps, perhaps the calls for blood would have been less intense.
Zuckerberg’s comments on Recode Decode that he didn’t think Holocaust deniers were ‘intentionally getting it wrong’
Following the Cambridge Analytica scandal, Zuckerberg continued to do more interviews and host more conference calls with reporters in an attempt to win back some goodwill. This PR offensive included a visit to Kara Swisher’s Recode Decode podcast in July. The stop came amidst calls for Facebook to suspend conspiracy theorist site Infowars (more on that in a bit). In response to questions from Swisher about what should and shouldn’t be allowed on the platform, Zuckerberg argued that the company has a responsibility to give people a voice, even if he or other users find their views abhorrent. Unprompted, he brought up the example of Holocaust deniers.
“I’m Jewish, and there’s a set of people who deny that the Holocaust happened,” Zuckerberg told Swisher. “I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think — It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly.”
Critics thought Zuckerberg’s comments let Holocaust deniers off the hook. The Anti-Defamation League chastised Zuckerberg, noting that “Holocaust denial is a willful, deliberate, and longstanding deception tactic by anti-Semites that is incontrovertibly hateful, hurtful, and threatening to Jews.” Zuckerberg later clarified that “I absolutely didn’t intend to defend the intent of people who deny that.”
Zuckerberg’s answer was reflective of Facebook’s continued failure to respond to a changing landscape. For many people, tech companies’ pledges to allow users to express distasteful views as long as they’re not directly encouraging violence are now seen less as free speech protection and more as a callous or profit-motivated move.
Waffling over whether to kick Alex Jones off the platform
Over the summer, Facebook and other websites faced increased pressure to kick Infowars’ Alex Jones off their respective sites. In July, a reporter asked Facebook during one of its routine meetings with journalists why Infowars was allowed to remain on Facebook, if it was indeed peddling fake news. The company’s head of News Feed, John Hegeman, tried to soften the issue by saying that Infowars simply had a “different point of view.” But Jones didn’t just have a different point of view — he spread a vicious and dangerous conspiracy theory that a shooting at Sandy Hook Elementary School in Connecticut six years ago was a hoax.
Facebook did hit Jones’ personal page with a 30-day suspension later that month for violating policies against promoting hateful content — but the company failed to suspend Infowars and Jones’ public pages, both of which had been guilty of similar violations. Facebook finally suspended both Jones and the Infowars page from its site on August 6, but only after Apple and Spotify removed episodes of Jones’ podcast from their platforms. Zuckerberg later told the New Yorker that it was Apple’s ban that prompted Facebook to permanently ban Jones and Infowars, as the episode convinced the company that “we should move on what we know violates the policy.”
Here, Facebook missed a crucial opportunity to regain goodwill by showing that it was willing to be a first mover in banning a high-profile, repeat rule-breaker like Jones.
Botching enforcement of the ‘paid for by’ label in its political ad archives
Facebook needed to prove in 2018 that it could successfully stop foreign actors from trying to sway voters by posing as U.S. residents. So it started requiring U.S. political advertisers to verify their location and identity and dumped all U.S. political ads into an archive that users could browse if they wanted to see who was running ads and who was paying for them. Advertisers themselves were responsible for filling out the public “paid for by” label. But, as a Vice News investigation found, Facebook didn’t do a good job of ensuring advertisers were being truthful in their submissions. In a test, Vice was approved to run ads paid for by Mike Pence, ISIS, and all 100 sitting U.S. senators.
The company tried to defend itself by saying that it can’t catch everyone who is trying to game the system. That’s true. But Facebook should have known by then that self-disclosure has serious drawbacks and that this feature had the potential to be heavily gamed. Plus, Facebook should know the true identity of its advertisers — so why was disclosure left up to them in the first place?
Not being forthcoming about its relationship with public relations firm Definers
In November, the New York Times published an in-depth report looking at how Facebook and its top executives had struggled to contain the backlash from the platform’s role in the 2016 U.S. presidential election, as well as other scandals that called into question how well the company was safeguarding its users and their data.
One of the biggest sticking points raised by the article was Facebook’s relationship with a Washington, D.C.-based public relations firm called Definers. The firm essentially served as Facebook’s attack dog, using a conservative news site to push out articles criticizing competitors like Google and Apple. Definers also encouraged reporters to look into the financial connections between organizations critical of Facebook and financier George Soros — implying that these groups weren’t as much of a grassroots effort as they made themselves out to be. Because Soros has been the subject of anti-Semitic attacks, some critics viewed Facebook and Definers bringing up the Soros connection as itself anti-Semitic.
Almost immediately after the Times’ story was published, Facebook announced it was severing ties with Definers. The following day, Zuckerberg said he hadn’t known about the company’s relationship with Definers until the Times’ story was published, leading reporters to ask exactly who at Facebook did know about any work with Definers. Facebook’s top brass refused to directly answer the question until the day before Thanksgiving, when outgoing communications chief Elliot Schrage took responsibility for hiring Definers, with Sandberg acknowledging that she “also received a small number of emails where Definers was referenced.”
The episode had all the markings of a classic Facebook PR bungle — waiting to announce policy changes until after a critical story is published, feigning innocence about how ruthless the company can be with competitors and critics, refusing to respond to specific questions for days in the hopes that everyone will forget about the story, and finally providing an answer the day before a holiday when many people aren’t checking the news.