The Supreme Court’s Big Algorithm Fail

“These are not like the nine greatest experts on the Internet.”

Last Tuesday, as the Supreme Court began hearing oral arguments in key cases that could shape the future of the internet, it quickly became clear that some important information was missing. No one seemed able to explain how the algorithms that underlie the internet actually function.

“These are not like the nine greatest experts on the Internet,” Justice Elena Kagan admitted to laughter during the hearing. The day’s arguments would prove her self-deprecating point. And the justices’ lack of understanding, a failure to grasp how code can discriminate and cause harm, could determine the case’s outcome.

The case, Gonzalez v. Google, is the first time the US Supreme Court will consider limiting the broad legal immunity online platforms have enjoyed thanks to Section 230 of the 1996 Communications Decency Act. Section 230 emerged in the early days of the internet, offering online providers immunity from lawsuits over third-party content. For example, if a user posted illegal content on an AOL message board, under Section 230 AOL was not liable for that content; the person who posted it was. In the nearly three decades since the law was passed, Section 230’s immunity has been broadly interpreted by courts. Today, it is why platforms can not only host misinformation and other toxic content but also promote that content through their algorithms without being held responsible.

The facts in Gonzalez v. Google concern whether Google, the parent company of YouTube, can be held liable for its role in disseminating ISIS videos under a law that forbids aiding and abetting terrorists. The suit, brought by the family of Nohemi Gonzalez, an American student killed in ISIS’s 2015 Paris attacks, claims that the company should lose Section 230’s protection because it went beyond simply warehousing content by helping promote the videos.

This is where ignorance about algorithms came into play. “I was ripping my hair out, especially at the beginning of the oral arguments,” says Grant Fergusson, an attorney at the Electronic Privacy Information Center, recalling how both the justices and the Gonzalez family’s lawyer repeatedly described YouTube’s algorithm as “neutral.”

The term “doesn’t obviously reflect the way that algorithms actually function in the world,” explains Fergusson, who helped write EPIC’s amicus brief in the case, which asks the court to rein in Section 230 immunity.

In fact, algorithms are not neutral, unbiased tools that treat all content and all users the same. The inner workings of YouTube’s own algorithm are proprietary secrets. But it’s no secret that algorithms are generally built on datasets that reflect society’s biases and that, whether by design or through the associations they learn on their own, can create new harms.

Fergusson points out that algorithms like YouTube’s, meant to drive engagement and keep users on the site, actually change how users behave. They also incentivize content creators not only to post, but to design content that drives engagement. YouTube’s purportedly “content neutral” algorithm has a history of pushing users toward more radical videos because it learned that extreme content proved more engaging.
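Because YouTube’s actual system is a trade secret, here is only a rough sketch, in Python with invented names and numbers, of the feedback loop Fergusson describes. It shows how a single rule applied identically to every video can still steadily amplify whatever keeps people watching:

```python
# A hypothetical sketch of an engagement-maximizing recommender. YouTube's
# real system is proprietary; every name and number here is invented.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    watch_minutes: float = 0.0  # total minutes viewers have spent watching
    impressions: int = 1        # times the video has been recommended


def engagement(video: Video) -> float:
    # The only signal is engagement; topic and truthfulness never enter.
    return video.watch_minutes / video.impressions


def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # "Content neutral" in the justices' sense: the identical rule scores a
    # pilaf recipe and a propaganda video. Whatever keeps people watching wins.
    return sorted(candidates, key=engagement, reverse=True)[:k]


def record_view(video: Video, minutes_watched: float) -> None:
    # The feedback loop: more watch time lifts the ranking, which earns more
    # impressions, which earns more watch time. If extreme content is more
    # engaging, the "neutral" rule steadily pushes users toward it.
    video.impressions += 1
    video.watch_minutes += minutes_watched
```

Nothing in the rule mentions any subject, which is what the word “neutral” was gesturing at in court; the bias emerges from what the loop rewards.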

Thomas McBrien, another EPIC attorney who contributed to the group’s amicus brief, explains that the justices were myopically using the word neutral to mean “content agnostic.” 

This was apparent from the get-go. The very first question, from Justice Clarence Thomas, examined whether a platform could be held liable for software that treats cooking demos and ISIS propaganda equally. “Are we talking about the neutral application of an algorithm that works generically for pilaf and also works in a similar way for ISIS videos?” he asked. The lawyer for the Gonzalez family, Eric Schnapper, agreed that YouTube uses the same algorithm for all content and that it is therefore “neutral.”

In a 2008 case, the Ninth Circuit Court of Appeals held that “neutral tools” should receive Section 230 immunity. While the justices aren’t bound by that lower court precedent, Chief Justice John Roberts seemed to find its reasoning appealing: If the algorithm is the same for every subject, he said, “it might be harder for you to say that there’s selection involved for which they could be held responsible.”

The most enduring image from the oral arguments was the analogy of a bookstore. Chief Justice Roberts laid out the question: Someone walks in looking for a book on Roger Maris. The bookseller sends them to the sports section. The seller isn’t liable for the content of the books they’ve just recommended. Why is that different from what happens when someone searches for ISIS on YouTube and the platform puts forward an array of videos that might respond to that request?

Schnapper reworked the analogy to argue for the platform’s liability: “It’s as if I went into the bookstore and said, ‘I’m interested in sports books,’ and they said, ‘We’ve got this catalogue which we wrote of sports books, sports books we have here,’ and handed that to me. They created that content.” In this analogy, the catalogue is the algorithm, and, as a piece of content made by the company, it should not be protected by Section 230.

The bookstore analogy was repeated widely in press coverage of the case. But the analogy is maddening. Likening an algorithm to a dead-tree catalogue paints a simplistic picture that hides the awesome power of algorithms. A more accurate analogy: someone walks into a bookstore and asks for a book about Roger Maris, and the bookseller hands over a personalized catalogue based not just on the query but also on that customer’s age, race, gender, income, zip code, entire search history, and much more. Then, as the customer wanders the shelves, the bookseller keeps handing them a new catalogue based on whatever books they glance at.
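To make that picture concrete, here is a minimal sketch of the personalized catalogue, again with invented signals and weights, since the real ones are secret:

```python
# A hypothetical sketch of the personalized-catalogue bookstore. The profile
# fields and weights are invented; platforms' actual signals are trade secrets.

def personalized_catalogue(query, profile, recently_glanced_at, inventory):
    """Re-rank the entire inventory for this one customer, right now."""
    def score(book):
        points = 1.0 if query.lower() in book["title"].lower() else 0.0
        # The query is only one input; demographics and history also steer it.
        points += 0.5 * len(set(book["topics"]) & set(profile["interests"]))
        # Every shelf the customer glances at reshapes the next catalogue.
        points += sum(1.0 for t in recently_glanced_at if t in book["similar_to"])
        return points
    return sorted(inventory, key=score, reverse=True)


# Two customers asking the identical question get different catalogues:
catalogue = personalized_catalogue(
    query="Roger Maris",
    profile={"interests": ["baseball", "history"]},
    recently_glanced_at=["The Boys of Summer"],
    inventory=[
        {"title": "Roger Maris: A Title-Chasing Season",
         "topics": ["baseball", "biography"],
         "similar_to": ["The Boys of Summer"]},
        {"title": "Cooking with Rice",
         "topics": ["cooking"],
         "similar_to": []},
    ],
)
```

Unlike Roberts’ bookseller, this one never gives two customers the same answer, and it rewrites its recommendation continuously as the customer browses.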

Justice Neil Gorsuch asked the only question that seemed to be informed by a complete picture of how algorithms work: “Is an algorithm always neutral?” he wondered. “Don’t many of them seek to profit-maximize or promote their own products? Some might even prefer one point of view over another.” But his question, almost rhetorical, was buried in a longer exchange, and no one responded to it. 

The question facing the court, whether YouTube recommendations produced by algorithms are shielded from liability under Section 230, does not necessarily hinge on how algorithms work. Algorithms are not mentioned in the law, and the court could rule that they are therefore not protected. But the failure to accurately reflect the harms many algorithms cause is a failure to frame the stakes of the case.

Google and its supporters in this case describe Section 230 as the pillar that holds up the modern internet. Should the justices rein in Section 230 immunity, they warn, the resulting flood of lawsuits would make the internet’s sky fall. The justices were clearly perturbed by this chaotic possibility. But what wasn’t conveyed was the way the modern internet, driven by profit-maximizing algorithms that developed under Section 230’s protections, does hurt people. Given harms that range from deadly disinformation to violations of people’s civil rights, maintaining the status quo is not necessarily the least harmful option; for corporations like Google, though, it is the most profitable.
