Why Mark Zuckerberg’s Plea for Regulation is an Enormous Cop-Out

The social media giant follows an old tech playbook: Break things and let someone else clean up.



In October, when the Trump campaign began running a slew of ads spreading falsehoods about then–Democratic presidential frontrunner Joe Biden and his son’s business dealings in Ukraine, Facebook opted to do nothing, citing policies allowing politicians to include false claims in political ads that they pay the platform to spread.

Since then, the company has repeatedly suggested that it would like to take action against such content, but won’t until the government steps in and offers regulations that would be uniform across the industry. On Saturday, company CEO Mark Zuckerberg reinforced this stance at the Munich Security Conference, saying that “we don’t want private companies making so many decisions balancing social equities without democratic processes. I do think that there should be regulation in the West on harmful content…there’s a question about which framework you use.”

The argument neatly fits a longstanding pattern of tech companies wading into new, potentially destructive areas seeking revenue, only to follow up by saying they’ll punt on making difficult ethical decisions that could hurt their bottom line until governments step in and force them to.

Though Twitter banned all political advertising last year and Google banned targeted political ads, Facebook reiterated just last month that it would not follow its major rivals and would continue to let politicians lie in political advertisements. At the time, Rob Leathern, Facebook’s director of product management, explained the company’s reasoning: “Ultimately, we don’t think decisions about political ads should be made by private companies, which is why we are arguing for regulation that would apply across the industry,” he wrote. “Frankly, we believe the sooner Facebook and other companies are subject to democratically accountable rules on this the better.”

It’s deeply ironic that the platform claims to want to be “democratically accountable” when it has built an apparatus so powerful that it has in effect become its own digital government, whose rules are subject to no vote and accountable to no constituency. Still, the company isn’t the first tech giant to pursue profit while largely outsourcing responsibility.

In December 2018, for example, Microsoft President Brad Smith wrote a post acknowledging the potential for his company’s facial recognition products to exacerbate discrimination and erode privacy, and asking the government to step in. “We don’t believe that the world will be best served by a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success,” he wrote.

Uber, Lyft, and Airbnb pioneered an aggressive version of this playbook by strong-arming their way into new cities, ignoring existing taxi and hotel laws and reshaping the business environment while fending off regulators. Only after that would they typically agree to work with cities on some type of regulatory agreement.

The strategy lets technology companies commit a sort of regulatory arbitrage: they figure out how to make money by using technology to circumvent laws in ways policymakers could not have anticipated when the existing legal frameworks were written. Uber has created urban traffic congestion, dragged its feet on background checks for its drivers (which may have contributed to its failure to screen out drivers who went on to sexually assault riders), and figured out how to pay its workers effectively below minimum wage.

The mess that Facebook is making and asking lawmakers to clean up isn’t all that different from Uber blowing past municipal regulations and then leaving lawmakers to figure out how to address all the new issues it created. But instead of messing with urban planning, Facebook has enabled the flow of disinformation and misinformation, creating new means to mislead and manipulate people.

Of course, governments have enough to do, and don’t need multibillion-dollar corporations offloading new public problems onto them. Sometimes this is inevitable with new technologies. But companies like Uber, Facebook, and Microsoft have made it part of their business models: Do whatever you want and leave regulators to work out the potential kinks.

It’s possible to strike a balance between asking for regulation and making difficult, principled decisions in the meantime. After Microsoft called for federal regulation of facial recognition, it denied at least one law enforcement agency’s request to use the software in cars and body cameras. Twitter’s and Google’s recent actions show that companies can change their political advertising policies while waiting for lawmakers to come through on their promises of regulation.

If government does move toward greater regulation of online political advertisement, it’s not as though Facebook is going to sit back and just watch the democratic process play out. It will send its army of well-paid lobbyists to shape the policy. This way, Facebook still likely gets a strong say in what happens, but gets to say that it let the government do it. Facebook can also rest assured that the government’s hands will be more tied than its own: Lawmakers would have to contend with constitutional restrictions that Facebook, as a private company, could ignore.

Facebook, whether intentionally or not, offered a deeper glimpse into its rationale for sticking to its current positions last month, when Vice President Andrew Bosworth said in a memo leaked to the New York Times that not changing Facebook’s current policies “may very well” lead to Trump winning the 2020 presidential election. Knowing this, he said, “I find myself desperately wanting to pull any lever at my disposal to avoid the same result,” but that he wouldn’t, out of a sense of responsibility to be fair and impartial.

What Bosworth, and maybe all of Facebook, fails to realize is that inaction to maintain the status quo is itself a form of partiality. In the context of political advertising specifically, it’s a form of partiality that clearly benefits the side more willing to lie and use other underhanded tactics—a side that, in 2020, is easy to identify. Putting aside his campaign, in the last presidential election Trump alone told some version of a lie or “untruth” every 3.25 minutes, according to a 2016 analysis by Politico.

While Facebook has championed transparency as a tool to mitigate the worst effects of harmful political ads, because it has left its ad targeting tools intact, groups can still spread misleading or outright false information on the platform without anyone noticing. Alex Stamos, Facebook’s former chief security officer, disagrees with Facebook’s inaction and has explained the problems with its transparency in an interview with the Columbia Journalism Review’s Mathew Ingram. “In theory, the transparency efforts by a handful of companies should reduce the impact of micro-targeting, since those ads are no longer secret from the opposing party and media,” Stamos said. “In practice, we are likely to see tens of thousands of A/B tested and programmatically generated ads out of the Trump campaign in 2020 and the ability for the media or his opponent to correct the record when each is only seen by a tiny number of people is limited.” In other words, no matter how much transparency Facebook provides, the zone will be so flooded that most ads, deceptive or not, will never be parsed.

Wanting regulation is fine, but doing nothing in the meantime isn’t the same as being impartial. It’s tipping the scales and impeding the very mechanisms of “democratic accountability” that Facebook claims it wants to maintain.
