Facebook May Be High Tech, But It Uses an Old Fashioned Strategy to Undercut Bad News

The company’s touted “commitment to transparency” hasn’t stopped it from making disclosures when users are distracted.

Aurelien Meunier/Getty Images


Facebook announced last week that it had filed a lawsuit against Rankwave, a South Korean analytics company, over its alleged mishandling of user data.

It’s a new scandal with echoes of the platform’s dealings with Cambridge Analytica, the British political consulting firm at the center of perhaps Facebook’s biggest controversy to date. That sounds like major, or at least significant, news. But the lawsuit, despite coverage from serious national outlets, has so far gained little traction, and that is partly by the company’s design. Facebook announced the lawsuit at around 6 p.m. Eastern Time last Friday, when many journalists had already left the office, many news readers had checked out for the weekend, and major stock markets would be closed for at least two days, insulating the company’s share price. The result? All but the most plugged-in observers of the company missed the news.

Putting out news on Friday nights, amid larger national stories, and at other inopportune moments has become a go-to move for the company. With each such news dump, Facebook claims transparency while knowing the fewest people will be around to see it.

On April 18, the Thursday before both a holiday weekend and the highly anticipated release of the Mueller report, which was slated to command the news cycle, Facebook updated several lines in an old post to officially, and very quietly, reveal that it had accidentally stored millions of Instagram passwords in readable, unencrypted text. The update was appended to a post from the previous month in which the company had already admitted to a similar security mistake involving 600 million Facebook passwords.

Several months earlier, Facebook released a long-awaited report admitting that it had failed to keep its platform from exacerbating deadly violence in the ongoing genocide in Myanmar. While the admission was important, Facebook made it on the eve of the November 2018 midterm elections, a moment when most journalists and news observers would be distracted.

Later that month, on the day before Thanksgiving, Facebook’s chief operating officer, Sheryl Sandberg, and its then head of communications, Elliot Schrage, published a post admitting that the company had hired Definers, a right-wing public relations firm. Facebook had drawn criticism after the New York Times exposed its relationship with Definers, including the firm’s attempts to link Facebook’s opponents to the liberal financier George Soros, a billionaire who has long been the target of antisemitic smears.

Sandberg initially said she hadn’t been aware that Facebook had hired Definers, but backtracked in the Thanksgiving-eve news dump.

The Definers affair wasn’t the first time the company announced bad news near a major holiday. One year earlier, also just before Thanksgiving, Facebook announced plans for a feature that would allow users to check if they had seen content created and spread by Russian trolls attempting to influence U.S. politics. Facebook then released the tool the Friday before Christmas.

The company’s admissions of its errors in the “Newsroom” section of its site usually carry unspecific titles like “Keeping Passwords Secure” and “Enforcing Our Platform Policies,” which can make it seem as if the company is taking credit for good news instead of admitting its mistakes.

Facebook rejects the idea that it engages in news dumps or builds “artificial delays” in its news releases. Instead, a spokesperson said the company is always “working to share information with the public as soon as [it] can.”

“The reality is that we work as hard as we can to be as transparent as we can. We are a global company and make news announcement at any point of the week,” a Facebook spokesperson said. “We do not subscribe to the news dump strategy and there’s countless examples of that.”

The company pointed to a handful of such examples, including a Tuesday announcement last June that it would remove more than 10,000 accounts for violating its community standards, and a Monday announcement last July that it would shut down several apps because of “low usage,” among other non-Friday, non-holiday releases. But almost none of the highlighted announcements addressed major topics like privacy, disinformation, or legal action.

Critics like Jason Kint, CEO of the media trade association Digital Content Next, aren’t entirely convinced that Facebook isn’t plotting its news releases. 

“The strategy is clearly to get out a message that they can control and make sure they have the least amount of eyeballs on it,” said Kint.

He thinks this type of behavior is “typical of companies that have too much power.”

“You can draw a strong parallel between how they act with the press and the way that they show up for hearings,” he said, referencing Facebook’s frequent resistance to appearing before legislative panels. While the company has ultimately acceded to lawmakers’ demands and participated in a significant number of hearings after initial resistance, in some cases, as with the U.K. Parliament, it has outright rejected demands for Zuckerberg to appear.

While news dumps are a normal corporate communications strategy, Facebook’s use of them to dampen its many public missteps has become obvious, and it risks contradicting the company’s repeatedly touted “commitment to transparency.”

Lindsey Barrett, a staff attorney and teaching fellow at Georgetown University’s Communications and Technology Clinic, says that Facebook’s news dumps are puzzling because they contain information the company could use to win back some public trust.

“It’s difficult to take their efforts as well-intentioned when every time Facebook discloses news it seems like the aim is to provide as little as possible, in as surreptitious a way as possible,” she says.
