Can an Oversight Board Created by Facebook Actually Fix the Company’s Failings?

Deciding what to do about Trump’s ban is only one thorny question. The board just might ignore the others.



When Donald Trump reemerged at last month’s Conservative Political Action Conference to give his first public speech since leaving the White House, he made the animating focus of his post-presidency perfectly clear: he won the 2020 election, which was stolen from him. This lie, which led to the deadly insurrection at the US Capitol and still threatens American democracy, remains Trump’s burning obsession. We just haven’t heard him say it much over the past several weeks because the former president can no longer post to Twitter and Facebook. 

Both social media companies took action to remove Trump after the January 6 riot, having determined he used their platforms to promote violence. Twitter made Trump’s ban permanent and purged his posting history. Facebook chose a different tack. It kept the former president’s Facebook and Instagram accounts up, but prohibited new posts—at least until its Oversight Board could weigh in. The company has asked the board, which began rendering decisions this year, to decide whether Facebook should keep Trump’s ban in place or welcome him back to the world’s largest social media platform. It is a significant decision for the country, the company, and the Oversight Board itself.

“Whatever the Oversight Board decides is going to be pretty consequential for the company and for the Oversight Board’s credibility,” says David Kaye, an expert in human rights law at the University of California-Irvine School of Law.

Facebook’s Oversight Board first met last year after two years of planning. While the company has touted it as an independent body that will make final content moderation decisions, the board has so far decided only a handful of cases, and concerns about its true independence remain. The outcome of the Trump case could either help build the body’s reputation among academics and civil and human rights advocates as a valuable shield against harmful content—or dash hopes in both the board and Facebook’s broader future as an ally of democracy and enemy of hate.

“This is a critical moment for Facebook, for the social media landscape more broadly,” says Janai Nelson, associate director-counsel at the NAACP Legal Defense and Educational Fund, who favors a permanent Trump ban. Her organization is one of several civil rights groups that have worked with Facebook for years to address harms to Black communities on its platform. “Facebook can either regulate itself and show that it has the capacity to establish an independent oversight board that can make appropriate and timely determinations, or it is inviting immediate and aggressive federal regulation.”

“They need to keep Trump off the platform, they need to protect the public interest,” says Carmen Scurato, senior policy counsel at Free Press, a digital rights group. “If they don’t, if they reinstate Trump, I think it does show that they’re not doing things in the public interest.”

The Oversight Board is run by a trust created by Facebook. While slated to have 40 members by the end of the year, it currently has just 19, largely lawyers and human rights experts drawn from around the world. Each case presented to the board is heard by a new five-member panel, whose decision becomes final once approved by a majority of the full board.

Facebook created the board, wrote its charter, and funded its first six years of operation with $130 million. And while Facebook has likened the board to a court, it is clearly a corporate, non-public entity. For now, the board can only consider whether to restore content Facebook has removed—so-called “takedowns”—as well as referrals from Facebook on other moderation decisions, making its mandate quite narrow. And while the board can make policy recommendations, they are just that. The board has no power to address Facebook’s controversial business practices around key issues, like micro-targeted advertising, the sorting of users into ideological bubbles, or the algorithmic amplification of misinformation. The board and Facebook expect to announce in the next few months that the board’s powers will expand to allow the review of content that remains on the platform. That modest expansion would keep the board trained on reviewing individual pieces of content—posts, comments, photos, and videos—not the larger questions about ads and algorithms that are at the heart of Facebook’s ability to cause harm.

Supporters of the board hope it will make meaningful recommendations that, over time, will influence Facebook. Others stress that the board should be thought of not as an independent court, as the company has encouraged, but simply as a possible extra shield against harmful content moderation decisions—not a stand-in for larger reforms or regulation.

Given all that, there are many who have written off the board’s utility, seeing it as a fantasy of corporate self-regulation. “It will have no authority over advertising or the massive surveillance that makes Facebook ads so valuable,” Siva Vaidhyanathan, a media studies professor at the University of Virginia, wrote in Wired last May. “It won’t curb disinformation campaigns or dangerous conspiracies. It has no influence on the sorts of harassment that regularly occur on Facebook or (Facebook-owned) WhatsApp. It won’t dictate policy for Facebook Groups, where much of the most dangerous content thrives. And most importantly, the board will have no say over how the algorithms work and thus what gets amplified or muffled by the real power of Facebook.”

Zephyr Teachout, a Fordham University law professor and Big Tech critic, used a succinct analogy to make a similar argument. “If Exxon hired a bunch of academics and thought leaders and paid them through a trust to occasionally tell them where not to drill, I would not say we’d figured out a mechanism for dealing with fossil fuel,” Teachout tweeted last month. Many others have expressed concerns that the board could emerge as a distraction from Facebook’s more fundamental failings.

“What I worry is going to happen here is that there’s this kind of theater around the board’s decision that is entrenching the notion that it’s the content moderation decision that matters,” says Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University. “It’s much, much less important than all these design decisions”—algorithms that sort people into filter bubbles and amplify hate, for example—”that Facebook isn’t talking about, that Facebook doesn’t want anyone to talk about. And that Facebook will never turn over to the board, because those design decisions are what ultimately determine whether Facebook makes money or not.” 

In an outside comment, one of thousands that activists and advocates have formally submitted to the board, the Knight Institute complains that “the board has effectively been directed to take Facebook’s design as a given, but it shouldn’t be treated as one.” The comment urges the board to refuse to rule on Trump’s case until Facebook commissions and turns over “an independent study of how its platform may have contributed to the events of January 6th.” 

There’s evidence that board members may be bristling at their constraints. “I think the board will want to expand in its scope. I think we’re already a bit frustrated by just saying take it down or leave it up,” Alan Rusbridger, a board member and the former editor of Britain’s Guardian newspaper, recently told a UK House of Lords inquiry into online freedom of expression. “What happens if you want to… make something less viral? What happens if you want to put an interstitial?” He added, “At some point we’re going to ask to see the algorithm, I feel sure—whatever that means.”

In addition to Facebook’s algorithmic design, there is a broader context for understanding how speech courses through the internet that the board should consider—though whether it’s equipped to do so is unclear. “It’s an online platform, and you need to think about the way in which accounts speak to their audience and integrate with other accounts,” says Kaye, who previously served as a United Nations special rapporteur on free speech. “Not just did this particular post incite this particular violence. But did the account as a whole play a role in inciting violence? Do they expect that it will continue to do so? And that’s just a pretty deep contextual analysis of how we think about incitement and speech in a digital space.”

So far, the board has decided just six cases, reversing, in all but one instance, a Facebook decision to take down a piece of content. The board’s administrative director, Thomas Hughes, has highlighted the fact that the board is ruling against the company as a demonstration of its independence. But for some civil and human rights advocates, those decisions have offered worrying signals about the board’s values. In one case, for example, the board restored an anti-Muslim post in Myanmar, a country where Facebook once helped spark an anti-Muslim genocide. The board reasoned that the post, which showed a dead Syrian child alongside text stating there is something wrong with Muslim men’s mindsets or psychology, was offensive but did not rise to the level of hate speech. In several other opinions, the board made similar judgments, deciding that content taken down by Facebook was not dangerous enough and should remain up—decisions that demonstrate an inclination to privilege the speech rights of the poster over the harm their speech could cause to others.

The Trump case will test the board in critical ways. At issue, according to the board’s summary of the case, are just two posts made to Facebook and Instagram on the afternoon of January 6, when the Capitol was under attack. The first is a video of Trump sympathizing with the rioters but asking them to leave peacefully. In the second, Trump repeats his claim that the election was stolen and tells his supporters to “Remember this day forever!” Both were removed under Facebook’s policy against dangerous individuals and organizations, and they prompted Facebook to suspend Trump’s posting abilities—first for 24 hours, and then indefinitely.

In referring the case to the board, Facebook posed two questions: Did it make the right decision in revoking Trump’s posting privileges? And what is the board’s recommendation on suspensions when the user is a political leader? Under Facebook’s instructions, the board’s answer to the first question will be binding—and its answer to the second merely advisory.

Civil rights groups and academics are concerned that the board will let Facebook dictate how it reviews the case. A common theme among the outside comments the board has received is a plea to go beyond those two posts when deciding whether to restore Trump’s access.

“Facebook is framing the case. And that, I think, is something that the Oversight Board needs to grapple with,” says Scurato. “If they accept Facebook’s framing from the get go, then I think they’re doing a disservice to themselves and to the process, and really to the public and to everyone’s safety.” According to the board’s bylaws, it can request additional information from Facebook as well as commentary from outside experts as it makes decisions. Though the rules make clear Facebook is not required to grant the board’s requests, they also don’t limit the scope of the board’s review.

Trump has a long history of violating Facebook’s rules. And Facebook, until January 6, had a history of looking the other way. The company repeatedly excused his posts and changed its policies to accommodate him, eventually applying a “newsworthiness” exception that allowed politicians to post misinformation under the premise that what they say should be “seen and heard.” In 2015, Facebook kept up Trump’s call for a ban on Muslims entering the country even though it violated the company’s hate speech rules. In 2018, it allowed Trump to post a video that dehumanized immigrants from South and Central America even though the video had already been deemed too racist to run as a paid ad. During the protests over George Floyd’s death, Facebook left up Trump’s “When the looting starts, the shooting starts” threat of violence against demonstrators. Around that time, Facebook CEO Mark Zuckerberg went on Fox News to defend his decision not to label as false Trump’s post claiming that mail-in voting leads to fraud—a claim Trump used for months to lay the groundwork for the dangerous lie that the election was stolen from him. When Trump lost the race and continued to baselessly allege significant fraud, Facebook began affixing links providing corrections and accurate information. But the posts remained up, and the “big lie” that Trump won gained steam, culminating in the deadly events of January 6.

It’s not possible, academics and civil rights activists argue, to understand the threat Trump poses if the board does not look at this broader context. “A decision to reinstate Trump, whose ongoing capacity to do serious harm is not captured in two decontextualized posts, is no less than the Oversight Board endorsing bigotry and hatred,” reads a comment submitted to the board by the Change the Terms coalition, which advocates for policies countering online hate speech. In an op-ed in Wired representing the University of North Carolina’s Center for Information, Technology, and Public Life, media scholars Daniel Kreiss and Shannon McGregor put the stakes even higher: “The danger here is that if the Oversight Board does not uphold Trump’s ban, it will set a precedent of valuing political elites’ expression over the right of the public to self-govern.”

There’s some logic to the claim that the board has demonstrated independence with its spate of decisions overruling the company and restoring removed posts. A decision to return control of his accounts to Trump could arguably be seen in the same light: a demonstration of the board’s willingness to reverse one of Facebook’s most high-profile decisions. But with the Trump case, the board’s greatest opportunity to buck the company may be to keep Facebook’s ban in place, but only after conducting an assessment of the risk Trump poses beyond those two posts. If it declines to undertake such a broader historical look, the limited grounds of the case might force the board to reverse Facebook’s decision, providing a stark demonstration of the consequences of refusing to grapple with issues outside the scope laid out by Facebook.

For years, Facebook has gone out of its way to protect Trump, and in doing so embraced the position that politicians’ speech should not be censored. Though Facebook finally found, after violence wracked the Capitol, that the balance of harm tilted against Trump, in practice it still endorses the underlying concept that world leaders’ words should not be restricted. If the board restores Trump’s posting abilities, that would be a reversal of Facebook’s ban, but an approval of the company’s overall hands-off framework for political figures. But if the board keeps the ban in place and takes the extra step of advising that world leaders should not have extra privileges on the platform, it would be a much more dramatic affront to Facebook—and one that could stem other dangerous leaders’ ability to spread their message.

How the Oversight Board tackles these tasks will go a long way in determining whether civil rights advocates and others have faith in its ability to mitigate the harms Facebook has caused. But they may be hoping for more from the board than it was ever set up to achieve.

“We’re hopeful that what was a clear effort to incite a violent overthrow of a branch of government is going to be recognized as such by this Oversight Board,” says Nelson of NAACP LDF. “We strongly urged this Oversight Board to set a bar. And frankly, this is a very low one. So if they can’t set it here, then it’s hard to understand what their judgment will be on any other matters.”
