
Will SCOTUS Overturn This 1996 Law Governing the Internet?

“I don’t know if I’ve ever seen lawyers do so much damage to their own cases.”

People wait in line outside the US Supreme Court in Washington, DC, on February 21, 2023. Jim Watson/AFP/Getty Images


In November 2015, Nohemi Gonzalez was a 23-year-old California State University, Long Beach, student studying abroad in Paris. That’s when a coordinated series of terrorist shootings and bombings in the city killed 130 people. She was one of them. ISIS claimed responsibility for the attacks—but Gonzalez’s family charges that the massive search engine Google is also partly responsible. 

On Tuesday, they brought that argument to the Supreme Court in a case that could fundamentally challenge the broad immunity internet platforms enjoy for hosting user-generated content. In Gonzalez v. Google, the family argues that YouTube, which is owned by Google, assisted ISIS by recommending a stream of ISIS videos to terrorism-minded users through its algorithm. That algorithm, they say, in effect helped the tech giant indoctrinate terrorists in violation of the Antiterrorism Act, which allows Americans to sue those who "aid and abet" acts of international terrorism.

Lawyers for Google, however, point to a 27-year-old statute they claim completely immunizes the platform. Magazines and other news outlets are liable for the content they write and publish, but internet platforms that host user-generated content aren't held to the same standard because of Section 230 of the 1996 Communications Decency Act. It states that no "provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In other words, internet platforms are almost never liable for what their users post; only the user is.

From massive content-hosting companies like Google, Facebook, and Twitter to small porn or munitions sites, platforms have invoked Section 230 to escape responsibility for anything their users publish, even when that content contributes to tragedies like widespread election misinformation, the attack on the US Capitol, or the death of an American student in Paris.

For the first time ever, however, the scope of Section 230’s liability shield is now being challenged in front of the nation’s highest court. If Google prevails in the court’s decision—which is expected to be handed down by July—the status quo of internet platforms’ immunity over their user-generated posts will continue. Should the Supreme Court rule in favor of the Gonzalez family, the so-called “Magna Carta of the Internet” will be hamstrung, irrevocably changing the foundation of the world wide web and how billions of people interact on it. 

"The primary thing we do on the internet is we talk to each other," Eric Goldman, a Santa Clara University professor specializing in internet law, told the Associated Press, describing the stakes of eliminating Section 230. "The Supreme Court could easily disturb or eliminate that basic proposition and say that the people allowing us to talk to each other are liable for those conversations. At which point they won't allow us to talk to each other anymore."

During nearly three hours of oral argument, the Gonzalez family's lawyer, Eric Schnapper, argued that it wasn't the mere fact that YouTube hosted ISIS-linked videos that should make its parent company liable for the student's death. What mattered most, he said, was that YouTube repeatedly recommended those videos.

His argument hinges on how the internet has changed since its nascent days when Section 230 was enacted. Back then, the primary goal of internet platforms was to acquire users. Now that internet usage across the developed world is nearly ubiquitous, an increasingly important goal of platforms isn’t just to attract users, but to keep them engaged longer so they can sell more advertisements for higher prices. They do this by tailoring content through algorithms that analyze individual user data. 

“This was a pre-algorithm statute, and everyone is trying their best to figure out how this statute applies,” Supreme Court Justice Elena Kagan said, summarizing the question before the court. “Every time anyone looks at anything on the internet, there is an algorithm involved.”

But the pervasiveness of internet algorithms, which underpin the case, may also be what sinks it. If the Court were to strip Section 230 protection from the YouTube algorithm that allegedly fed ISIS videos to people susceptible to joining the terrorist group, it could undermine Section 230 everywhere else it currently applies. Dating apps, for example, use your swiping history to recommend potential new matches; TikTok does the same to serve up tailored videos; and Instagram does it to surface posts likely to engage you.

Even if other platforms’ algorithms may not lead users to ISIS content, Kagan suggested, different algorithms may lead users to other bad content. “Maybe they’ll produce defamatory content, or maybe they’ll produce content that violates some other law. Your argument can’t be limited to this one statute,” Kagan said, referring to the Antiterrorism Act. “It has to extend to any number of harms that can be done by speech, and so by the organization of speech, in ways that basically every provider uses.”

Schnapper said there were ways for internet platforms to host content without recommending it the way the Gonzalez family says YouTube did. Google's lawyer, Lisa Blatt, countered that sorting the copious content published on large internet platforms amounts to "helping users find the proverbial needle in a haystack," which she described as "an existential necessity on the Internet."

While the Court's decision may still be months away, the Justices don't seem eager to rewrite the rules regulating the internet, even though the arguments made clear they have some concerns about Section 230. (Kagan, for example, questioned why internet websites get "a pass" while other sectors of the economy are required to "internalize the costs of misconduct.") Instead, several Justices suggested that if anyone should alter Section 230, it should be the branch of government that originally wrote it: as Justice Brett Kavanaugh put it, it may be better "to put the burden on Congress" to amend the law.

Nor, in the view of legal experts, did the performance of the plaintiffs' counsel do much to sway the justices. "I don't know if I've ever seen lawyers do so much damage to their own cases," Tim Wu, the former US Special Assistant to the President for Technology and Competition Policy, tweeted Tuesday. "Schnapper for petitioner was way out of his league and threw away every lifeline thrown to him. Painful to watch such a nationally important issue be so badly argued."

No matter what happens, the plaintiffs' bid to narrow Section 230 may be irrelevant, at least for the time being. On Wednesday, the Supreme Court hears yet another case about the internet, one in which it must decide whether platforms like Twitter and Google can be sued under the Antiterrorism Act at all. If the Justices decide the answer is no, the arguments about Section 230 in the Gonzalez complaint, as currently written, wouldn't matter.
