While some lawmakers and privacy advocates have criticized Mark Zuckerberg’s recent call for greater government oversight, deeming it PR-driven and self-serving, the proposal he outlined goes a long way toward addressing an issue of enormous significance in the tech space. He advocates for third-party content standards, which would sharply reduce the power of Facebook and other companies to decide what constitutes protected speech.
Freedom of speech, guaranteed under the First Amendment of the Constitution, is among our most basic and fundamental rights. Its protection on social networking platforms has become increasingly vital as these sites serve as near-ubiquitous, monopolistic communication vehicles. Preserving this freedom while stemming the proliferation of hateful or harmful online content is an extremely difficult, delicate task – one that should not fall to corporations.
Rather, representatives of the people must do the difficult work of setting standards for protected online speech, ensuring regulations don’t open doors for government-led censorship or create open-ended precedents that would allow expansive regulation of content. When disputes arise, and they inevitably will, courts and other independent judicial bodies must have the final say over what does and does not constitute a violation of constitutional rights.
Determining who sets these standards is crucial given the high stakes and potentially dire consequences.
Take the tragedy in Myanmar, for example. Widespread reporting has illustrated how the country’s military leaders intentionally used Facebook to incite systemic violence against the Rohingya minority, resulting in the largest forced human migration in recent history. Since then, the United Nations, among other groups, has stated that by failing to take appropriate action at the time these atrocities were occurring, Facebook contributed to acts of genocide.
By all accounts, the implications of this tragedy have not been lost on Facebook. The company sought to investigate its role in the genocide and met with Myanmar officials. Notably, it also saw its stock price and employee morale plummet, a decline compounded by the company’s other recent corporate missteps, including its handling of numerous data privacy scandals and its role in the 2016 election. Still, Facebook’s decision this fall to create an independent oversight group to review content-moderation appeals seemed to stem from genuine corporate soul-searching over Myanmar as well as a recognition that acting as the arbiter of protected versus unprotected speech is simply bad for business.
But Zuckerberg’s call for third-party content standards goes a step further. In endorsing such oversight, he’s signifying an understanding that clear rules – set, governed, and enforced by independent bodies – are not just a public good and a basic principle of the rule of law, but would be highly beneficial to companies like Facebook.
In truth, Zuckerberg’s proposed regulatory framework has many potential corporate benefits. It would allow companies to avoid the tremendous, established dangers involved in making decisions that significantly impact our most basic freedoms, and it would provide greater legal certainty about what is and isn’t acceptable business behavior in highly sensitive operating environments. This would help mitigate risk, limit liability, and foster a stable, less perilous environment in which to do business.
This pro-regulatory stance was once unthinkable in Silicon Valley. Ignoring the progress it represents does not help efforts to persuade the technology industry to proactively support the work of legislators and regulators.
While Zuckerberg’s critics are correct in their assertion that Facebook’s actions stem from self-interest and an obligation to shareholders, the mitigation of harmful online content is an area in which the interests of Facebook dovetail with those of the public.
Lise Smit is Senior Research Fellow for Business and Human Rights at the British Institute of International and Comparative Law
Ulysses Smith is a U.S.-based attorney, Director of the Business and the Rule of Law Program, and a Senior Research Fellow at the U.K.’s Bingham Centre for the Rule of Law.