On Wednesday morning, the Facebook Oversight Board, the global deliberative body set up by Facebook to adjudicate content decisions, finally issued a decision on the indefinite suspension of former president Donald Trump from the platform. Facebook has spent two years, countless hours and $130 million setting up the board, in part to outsource responsibility for making controversial and unpopular decisions. So when the biggest hot potato of recent years — the insurrection of Jan. 6 and Trump's role in instigating it — landed in Facebook's lap, company executives quickly tossed it to the new board.
And the board, in an almost 10,000-word decision making technical arguments about the rule of law and procedural fairness, threw it right back to Facebook.
The background of the Trump case is crucial to understanding what the board did. For years, Facebook created and maintained formal and informal carve-outs to its community standards for powerful individuals on the ground that such individuals were "newsworthy." Had Trump been an ordinary individual, he would have been a serial violator of Facebook's rules and kicked off the site long ago. But Facebook kept him on.
Facebook's newsworthiness exemption was — as many inside the company said at the time — a terrible idea. The definition of newsworthiness is subjective and circular, and using it as a guidepost effectively made Facebook the arbiter of what is or isn't news. More important for this case, giving people special treatment because their speech was newsworthy had predictably awful effects: When you give exceptions to the most powerful people in the world, they will abuse their power. That's exactly what Trump did. Create a terrible rule, and you get terrible consequences.
On Jan. 6, this accommodation to power finally blew up in Facebook's face. After initially taking down two of Trump's posts, and then his account, Facebook chief executive Mark Zuckerberg announced the following day that Trump was suspended "indefinitely." Around the world, leaders from Russian dissident Alexei Navalny to German Chancellor Angela Merkel decried the decision, alarmed at the power of the platform to unilaterally remove a democratically elected leader, demagogue though he might be. In response to public pressure, Facebook availed itself of the new Oversight Board — an independent organization of its own creation, staffed with experts in international human rights and freedom of expression, that hears user appeals of Facebook's content moderation decisions. On the evening of Inauguration Day, Jan. 20, Facebook sent the question of "Trump's indefinite suspension" to the board for a binding decision and policy advice on how to deal with similar issues surrounding world leaders in the future.
It was a critical moment for the board, which was not even a year old. It was also a critical moment for Facebook, whose leaders probably hoped that the board would relieve it of responsibility for making a tough decision. Whether the board restored Trump to the site or kept him out, Facebook could point to the decision and say, "They made us do it."
The board, which models itself after an appellate court, was not willing to be used in this way. Its decision pointed out that Facebook had not employed clear rules and that it had made up new ones on the fly to dump Trump. That was not consistent with rule-of-law values or with procedural fairness. Moreover, the board found that Facebook had merely suspended Trump "indefinitely." This implied that there would be a final determination somewhere down the line. No such determination ever came. Instead, Facebook had simply offloaded the case to the board. But the board made clear that while Facebook can keep Trump off for now, if it wants to expel the former president permanently, it will be Facebook, and not the Oversight Board, that must create actual rules for doing so and apply them to Trump's case.
The board also pushed back in two other ways. Each presages future conflicts between the board and the company that created it.
In its arguments before the board, Facebook had asserted that it "has never applied the newsworthiness allowance to content posted by the Trump Facebook page or Instagram account." If Facebook was saying that Trump has never gotten special treatment, this was either false or seriously misleading. And if Facebook expects the board to act like an independent appeals court, making arguments like that is a mistake: The one thing you never do before a court is try to hoodwink it. This put the board in a bit of a bind. Without calling Facebook out directly, the board simply noted that Facebook's "lack of transparency" creates the perception that "the company may be unduly influenced by political or commercial considerations." Hence, the board demanded that Facebook address "widespread confusion about how decisions relating to influential users are made."
Facebook had also told the board that Trump's "repeated use of Facebook and other platforms to undermine confidence in the integrity of the election . . . represented an extraordinary abuse of the platform." In response, the board asked Facebook how much its own algorithms and technological design had "amplified Mr. Trump's posts" and contributed to the Jan. 6 riots. This came uncomfortably close to asking Facebook how its business model might have incentivized conspiracy theories and violence. Facebook "declined to answer these questions," the decision noted. This may be one of the most important sentences in the opinion. If Facebook wants the board to be respected as an independent court, stonewalling its questions won't help. Stymied by its creators, the board wrote that Facebook's refusal to disclose how it uses algorithms to shape public discourse made it difficult to give the company the benefit of the doubt on whether it acted reasonably in responding to Trump. The board clearly recognized that publicizing Facebook's refusals to cooperate or provide information is the best and possibly only way to put pressure on the company to behave more responsibly in the future.
Facebook created the Oversight Board to outsource responsibility, buy a little legitimacy, and offload difficult decisions about how to adjudicate speech claims to someone else. The board's most high-profile ruling to date shows that the board won't always be willing to play along. Instead, sometimes the world's largest social media company will have to take responsibility for itself.
Jack M. Balkin is Knight Professor of Constitutional Law and the First Amendment at Yale Law School. His latest book is "The Cycles of Constitutional Time."
Kate Klonick is an assistant professor of law at St. John's University Law School and an affiliate fellow at the Information Society Project at Yale Law School. This piece was written for The Washington Post.