Everyone in the World Wants to Fix Facebook. Here’s Why No One Can.

“We’re going to see a lot of attempts to draft laws, and I think we’re going to see a lot of different experiments.”

While the United States holds congressional hearings to listen to tech giants say they want to do better, other countries are passing and promising new laws that aim to take action against the virulent spread of disinformation, violence, and hate speech.

Last week, British Prime Minister Theresa May unveiled a proposal that outlines new regulations for social media companies, holding them responsible for a “duty of care” that includes strict penalties if hate speech is not policed. “The era of social media firms regulating themselves is over,” she says in a video on Twitter.

May is not alone in her efforts. Australia passed a law that bans “abhorrent violent material” on social media in the wake of the massacre in Christchurch, New Zealand, that left 50 people dead at the hands of a gunman who livestreamed his rampage. Canada is “actively” considering content moderation. Singapore plans to crack down on “fake news” with the Protection From Online Falsehoods and Manipulation Bill. And there is the long-standing “Great Firewall” of China, a catch-all term for a long-held government policy of extreme online censorship.

“I think we’re really at a turning point for this issue,” says Evelyn Douek, a doctoral candidate at Harvard Law School who studies how social media is regulated internationally. “But it’s other governments around the world that are making these decisions. The United States has largely absented itself from this field.” 

While the laws being proposed in other countries may sound similar, there is no universal model. Instead, each country is trying its own methods. “It’s not a split choice between no regulation and China,” she says.

We asked Douek how these laws are being crafted, what should be considered, and what Australia’s “mess” of a new law portends for the future of regulation.

Mother Jones: Can you explain how the new Australia law to ban harmful content would work?

Evelyn Douek: Well, the way that law works is it defines a category of content which it calls “abhorrent violent material.” So murder, a terrorist act, rape—it gives a list of offenses—and it says that a content provider or a hosting service has to remove this kind of content “expeditiously.”

It doesn’t define what “expeditiously” means, but this law has been passed in response to what happened in Christchurch. We know that the Australian government was unhappy with how long the Christchurch shooting video was out—which was about 80 minutes—so “expeditiously” likely means social media companies need to be getting that content down faster than that. If they fail to do so, they’ve committed a criminal offense. The law says that people can be imprisoned or, in the case of a company, it can be fined 10 percent of its annual turnover or revenue.

It’s a particularly poorly drafted law, and it’s getting a lot of justified criticism. I don’t think this means Mark Zuckerberg is going to be looking at the inside of an Australian prison cell anytime soon. First of all, it’s not exactly clear whether the law would apply to him personally, to his company, or to the people in charge of content moderation. But even if we were to say, “Okay, under this law an Australian court found that Mark Zuckerberg had committed this crime,” there’s still no way to enforce that unless he enters Australian jurisdiction, which he is unlikely to do.

MJ: Do we have a solid model of how nations should create these laws?

ED: No, absolutely not. In a lot of ways, this techlash over the past few years is bringing this all to a head. We’re going to see a lot of attempts to draft laws, and I think we’re going to see a lot of different experiments. But has any country got it right? Are there any models that experts agree on as being the correct way to do it? Absolutely not—yet.

MJ: So what does progress look like?

ED: No one has an answer to this problem at the moment. I certainly think that we are going to see a lot more regulation in this space. A lot of governments are talking about it, and we are just going to see a lot more laws, and we’re going to have to learn through experience. That’s often how law works: we learn through experimentation, which doesn’t necessarily sound ideal, but we’re grappling with an entirely new paradigm.

Are they going to be as bad as Australia? No, that example is particularly egregious. I’m Australian, so I guess I’m allowed to say this: The way in which that law was passed was a particularly poor showing. It was rushed through Parliament within two days.

MJ: To what degree will another nation’s laws affect a user in Ukraine or Bulgaria or the United States? How will these all work together to change social media for everyone?

ED: Well, we’re going to find out! The companies often insist on having a global set of norms—they like the idea of a uniform set of rules that apply around the world. Facebook talks about this sort of thing: it helps create a frictionless experience. Because if you have a patchwork of laws applying to a [Facebook] thread, you’d have some comments missing, and you’d have replies to comments that aren’t there. It would be a bit of a mess.

I’m not sure how persuasive I find that argument. I think, realistically, it’s in their commercial interest to resist regulation because complying with lots of different laws around the world is expensive for them. It’s difficult to do. You need to have people who are focused on particular jurisdictions and are trying to apply the law in a localized way. And so I can see why they would be keen to avoid that. They would prefer it if everyone just agreed on something, and then they could just go about applying it. But the fact of the matter is that that’s not going to happen.

I think, realistically, the future that we’re heading towards is a much more local experience of the internet, where local laws apply to people’s experience online. Now, there is a caveat to that: there are going to be issues with enforcement. Will people be able to get around specific jurisdictions’ laws? Sure. But research shows that for the vast majority of internet users, the default laws are going to be the ones that apply to their experience.

It also does depend on platforms’ choices. Something people are really worried about is a race to the bottom: if one jurisdiction passes a particularly restrictive law, it might end up being the experience of the whole internet, because it’s easier for Facebook to just apply that rule globally. Someone in Ukraine’s social media changes because Holocaust denial is illegal in Germany.

MJ: What local laws would work? What would you like to see in legislation from a country?

ED: I would really love to see more transparency from platforms. We need more information about what is going on: What moderation are they currently doing? How is content spreading? Then we can know exactly what the problems are. We need that before we can have a really sensible conversation about solving them.

The other thing that is important to note is that we also still don’t have a lot of transparency around how the regulations work. In Germany, for example, there’s been a lot of discussion about NetzDG [the Network Enforcement Act, which went into effect in 2018]. It created a 24-hour deadline for manifestly unlawful content: If a company fails to take that content down within 24 hours, it faces liability. That creates an incentive for platforms to err on the side of caution and take content down when they’re unsure, because they just don’t want to face potential liability. That’s led to a range of problematic takedowns. The most famous example is when content from a satirical magazine was taken down in 2018 on the grounds that it was hate speech.

NetzDG says that social media companies need to release transparency reports about how much content they are taking down, and so the companies release these reports that say they’ve taken down X amount of content, but the reports don’t tell us what that content was. We don’t know whether those were even correct decisions. We can’t actually look at what was taken down and see whether the law is being applied correctly.

MJ: To what extent is it going to have to be up to the companies to self-regulate?

ED: We also need more self-regulation and more transparency from the platforms about how their content moderation systems work and how they’re making decisions. I’m on the more optimistic side about the potential of Facebook’s proposed oversight board.

I’m not necessarily so bullish that it’s going to solve all of our problems or be a paragon of due process and transparency. But the current system is so Kafkaesque that having some sort of process for ventilating what’s going on within the platforms—and getting those arguments out and developing norms in a more iterative, common-law fashion—could be a dramatic improvement on the current ad hoc, opaque processes.

MJ: There are some concerns, too, about the ways increased transparency could further risk users’ privacy, right?

ED: Yes. Hate speech is a great example of that. It’s really hard to tell if something is hate speech in the abstract—you need contextual information in order to make that judgment. And so efforts like the Oversight Board that would, in theory, give us transparent decisions about how that classification was made are important, but we also don’t want to expose people’s private information to the public.

But do I think, specifically, oversight would increase the amount of information that the platforms collect about their users? I doubt it. They collect so much—as much information as they possibly can—already.

MJ: In an interview, Mark Zuckerberg said he couldn’t imagine the issue of hate speech ever being “solved.” Do you think this can be fixed?

ED: Do I ever think we’re going to get to a place where there’s no controversy around content moderation on platforms? Absolutely not. Free speech was one of the most contested areas of legal scholarship before this. We’ve been debating many of these questions for centuries. This long predates the internet; whether hate speech should or should not be censored is an extremely old chestnut.

But do I think that we can do a lot better than we currently are at content moderation? Do I think we’re going to make significant strides in bringing greater transparency to what’s going on on these platforms? Yeah, I am optimistic that we can do better.

This interview has been edited and condensed.
