Disinformation Isn’t Just a Tech Problem. It’s a Social One, Too.

That’s one important takeaway from a new report that studies the “crisis of trust and truth.”



Mis- and disinformation are often viewed as a cause of society’s ills. But a new report from the Aspen Institute’s Commission on Information Disorder, which studied the global “crisis of trust and truth,” offers a different perspective on how to think about the proliferation of conspiracy theories and bogus info: The rise of disinformation is the product of long-standing social problems, including income inequality, racism, and corruption, which can be easily exploited to spread false information online.

“Saying that the disinformation is the problem—rather than a way in which the underlying problem shows itself—misses the point entirely,” the report quotes Mike Masnick, founder of Techdirt, as saying.

Disinformation, as the report’s authors explain, comes from “corporate, state actor, and political persuasion techniques employed to maintain power and profit, create harm, and/or advance political or ideological goals,” and it “exacerbates long-standing inequalities and undermines lived experiences for historically targeted communities, particularly Black/African American communities.” The study of disinformation is a nascent, fast-moving field, and Aspen’s report, released Monday, departs from the conventional wisdom that information problems stem largely from tech and social media platforms.

The Aspen commission, co-chaired by Katie Couric, Color of Change president Rashad Robinson, and former Cybersecurity and Infrastructure Security Agency director Chris Krebs, spent six months examining the causes of and solutions to disinformation. Advisors to the commission included Joan Donovan, research director at Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy; Nathaniel Gleicher, head of cybersecurity at Meta (formerly Facebook); and Evelyn Douek, a lecturer at Harvard Law School.

The report outlines goals and recommendations for addressing disinformation on social media, including amendments to Section 230, a frequently debated and often misunderstood part of the 1996 Communications Decency Act, which gives tech companies legal immunity for user-generated content posted on their platforms. The report proposes “withdraw[ing] platform immunity for content that is promoted through paid advertising and post promotion” and “remov[ing] immunity as it relates to the implementation of product features, recommendation engines, and design.” In other words, algorithmically boosted content would no longer be legally protected.

The authors also urged the executive branch to take action on disinformation broadly, something that’s been controversial even among disinformation researchers and thinkers. They recommended the White House create a “comprehensive strategic approach to countering disinformation and the spread of misinformation” that includes “a centralized national response strategy.” 

Despite the breadth of its recommendations, the report’s authors still noted the limits of any effort to stop disinformation. “To be clear, information disorder is a problem that cannot be completely solved,” they write. “Its eradication is not the end goal. Instead, the Commission’s goal is to mitigate misinformation’s worst harms with prioritization for the most vulnerable segments of our society.”


