Read the Facebook Papers as the algorithm defending itself. Then you’ll understand.

We got an incredible window into Facebook yesterday. Adrienne LaFrance, executive editor of The Atlantic, published “History Will Not Judge Us Kindly,” after her exhaustive reading of the leaked internal documents known as the Facebook Papers.

LaFrance’s piece clarifies something that too many people are missing. Zuckerberg and his company do not serve users, shareholders, or employees. They are the handmaidens of the Facebook algorithm. Zuckerberg sees himself as the preserver and defender of that algorithm, and the internal and external forces arrayed against it (not against him, not against his company, but against it: the algorithm) as the enemy. And he is performing his duty, which is to serve the machine loyally. He does this because he fundamentally believes the machine does more good than harm.

Parsing LaFrance’s article from the algorithm’s point of view

I don’t have access to the leaked Facebook Papers — only a few news organizations do. But we get a good look into what’s there through LaFrance’s devastating review of those documents in The Atlantic.

To see what I mean, here are some key passages from LaFrance’s article, with a translation showing what they really mean from the algorithm’s point of view.

LaFrance wrote this about what happened immediately after the January 6 insurrection:

“This is a dark moment in our nation’s history,” Zuckerberg wrote [on Facebook’s internal employee sharing system], “and I know many of you are frightened and concerned about what’s happening in Washington, DC. I’m personally saddened by this mob violence.”

Facebook staffers weren’t sad, though. They were angry, and they were very specifically angry at Facebook. Their message was clear: This is our fault.

Chief Technology Officer Mike Schroepfer asked employees to “hang in there” as the company figured out its response. “We have been ‘hanging in there’ for years,” one person replied. “We must demand more action from our leaders. At this point, faith alone is not sufficient.”

“All due respect, but haven’t we had enough time to figure out how to manage discourse without enabling violence?” another staffer responded. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”

“I’m tired of platitudes; I want action items,” another staffer wrote. “We’re not a neutral entity.”

“One of the darkest days in the history of democracy and self-governance,” yet another staffer wrote. “History will not judge us kindly.”

Translation: Staffers are concerned that the algorithm is generating violence. Zuckerberg and Schroepfer tell them to hang tough, because the algorithm sometimes has harmful side effects.

Facebook has dismissed the concerns of its employees in manifold ways. One of its cleverer tactics is to argue that staffers who have raised the alarm about the damage done by their employer are simply enjoying Facebook’s “very open culture,” in which people are encouraged to share their opinions, a spokesperson told me. This stance allows Facebook to claim transparency while ignoring the substance of the complaints, and the implication of the complaints: that many of Facebook’s employees believe their company operates without a moral compass.

Translation: Employee complaints are part of how the algorithm works. Ignoring them serves the algorithm’s purpose.

Not only were the perpetrators live-streaming their crimes as they committed them, but federal court records show that those who have been indicted spent many weeks stoking violence on Facebook with posts such as “NO EXCUSES! NO RETREAT! NO SURRENDER! TAKE THE STREETS! TAKE BACK OUR COUNTRY! 1/6/2021=7/4/1776” and “Grow a pair of balls and take back your government!”

When you stitch together the stories that spanned the period between Joe Biden’s election and his inauguration, it’s easy to see Facebook as instrumental to the attack on January 6. (A spokesperson told me that the notion that Facebook played an instrumental role in the insurrection is “absurd.”) Consider, for example, the case of Daniel Paul Gray. According to an FBI agent’s affidavit, Gray posted several times on Facebook in December about his plans for January 6, commenting on one post, “On the 6th a f[*]cking sh[*]t ton of us are going to Washington to shut the entire city down. It’s gonna be insane I literally can’t wait.” In a private message, he bragged that he’d just joined a militia and also sent a message saying, “are you gonna be in DC on the 6th like trump asked us to be?” Gray was later indicted on nine federal charges, including obstruction of an official proceeding, engaging in acts of physical violence, violent entry, assault, and obstruction of law enforcement. He has pleaded not guilty to all of them. . . .

All over America, people used Facebook to organize convoys to D.C., and to fill the buses they rented for their trips. Facebook users shared and reshared messages like this one, which appeared before dawn on Christmas Eve in a Facebook group for the Lebanon Maine Truth Seekers:

This election was stolen and we are being slow walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people. Would there be an interest locally in organizing a caravan to Washington DC for the Electoral College vote count on Jan 6th, 2021? I am arranging the time off and will be a driver if anyone wishes to hitch a ride, or a lead for a caravan of vehicles. If a call went out for able bodies, would there be an answer? Merry Christmas.

The post was signed by Kyle Fitzsimons, who was later indicted on charges including attacking police officers on January 6. Fitzsimons has pleaded not guilty to all eight federal charges against him.

You may be thinking: It’s 2021; of course people used Facebook to plan the insurrection. It’s what they use to plan all aspects of their lives. But what emerges from a close reading of Facebook documents, and observation of the manner in which the company connects large groups of people quickly, is that Facebook isn’t a passive tool but a catalyst. Had the organizers tried to plan the rally using other technologies of earlier eras, such as telephones, they would have had to identify and reach out individually to each prospective participant, then persuade them to travel to Washington. Facebook made people’s efforts at coordination highly visible on a global scale. The platform not only helped them recruit participants but offered people a sense of strength in numbers. Facebook proved to be the perfect hype machine for the coup-inclined.

Translation: People forming communities and working together is part of how the algorithm works. Some of those communities lead to violence. That, too, is part of the algorithm’s design.

In April 2020, according to Frances Haugen’s filings with the SEC, Facebook employees had recommended tweaking the algorithm so that the News Feed would deprioritize the surfacing of content for people based on their Facebook friends’ behavior. The idea was that a person’s News Feed should be shaped more by people and groups that a person had chosen to follow. Up until that point, if your Facebook friend saw a conspiracy theory and reacted to it, Facebook’s algorithm might show it to you, too. The algorithm treated any engagement in your network as a signal that something was worth sharing. But now Facebook workers wanted to build circuit breakers to slow this form of sharing.

Experiments showed that this change would impede the distribution of hateful, polarizing, and violence-inciting content in people’s News Feeds. But Zuckerberg “rejected this intervention that could have reduced the risk of violence in the 2020 election,” Haugen’s SEC filing says. An internal message characterizing Zuckerberg’s reasoning says he wanted to avoid new features that would get in the way of “meaningful social interactions.” But according to Facebook’s definition, its employees say, engagement is considered “meaningful” even when it entails bullying, hate speech, and reshares of harmful content.

This episode, like Facebook’s response to the incitement that proliferated between the election and January 6, reflects a fundamental problem with the platform. Facebook’s megascale allows the company to influence the speech and thought patterns of billions of people. What the world is seeing now, through the window provided by reams of internal documents, is that Facebook catalogs and studies the harm it inflicts on people. And then it keeps harming people anyway.

Translation: Humans recommend fixing the algorithm to stop it from spreading hate and violence. Zuckerberg defends the algorithm at the expense of people’s safety.
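To make the quoted trade-off concrete, here is a minimal sketch in Python of the two behaviors LaFrance describes: ranking that treats any friend engagement as a signal worth amplifying, versus a “circuit breaker” that dampens engagement borrowed from sources the viewer never chose to follow. Everything in it (the Post fields, the weights, the dampening factor) is hypothetical and invented for illustration; it is not Facebook’s code.

# A toy illustration only: none of this is Facebook's code. The Post fields,
# the weights, and the dampening factor are hypothetical.

from dataclasses import dataclass


@dataclass
class Post:
    author: str
    followed_by_viewer: bool   # the viewer explicitly follows this page, group, or person
    friend_reactions: int = 0  # reactions from the viewer's friends
    friend_reshares: int = 0   # reshares by the viewer's friends


def engagement_score(post: Post) -> float:
    """Old behavior, as described: any engagement in your network is a signal."""
    followed_bonus = 3.0 if post.followed_by_viewer else 0.0
    return followed_bonus + post.friend_reactions * 1.0 + post.friend_reshares * 2.0


def circuit_breaker_score(post: Post) -> float:
    """Proposed tweak, as described: deprioritize content that surfaces only
    because friends engaged with it, so the feed leans on sources the viewer
    actually chose to follow."""
    followed_bonus = 3.0 if post.followed_by_viewer else 0.0
    # Engagement borrowed from friends is heavily dampened for unfollowed sources.
    dampening = 1.0 if post.followed_by_viewer else 0.2
    borrowed = post.friend_reactions * 1.0 + post.friend_reshares * 2.0
    return followed_bonus + dampening * borrowed


if __name__ == "__main__":
    viral_unfollowed = Post("unfollowed_page", followed_by_viewer=False,
                            friend_reactions=40, friend_reshares=10)
    quiet_followed = Post("local_group_you_joined", followed_by_viewer=True,
                          friend_reactions=2)

    for post in (viral_unfollowed, quiet_followed):
        print(f"{post.author}: engagement={engagement_score(post):.1f}, "
              f"circuit_breaker={circuit_breaker_score(post):.1f}")

In this toy model, the viral post from an unfollowed page scores 60 under pure engagement ranking and drops to 12 once borrowed engagement is dampened, while the post from a group the viewer joined is unaffected. That, in miniature, is the kind of intervention Haugen’s filing says Zuckerberg rejected.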

I’ve [LaFrance] been covering Facebook for a decade now, and the challenges it must navigate are novel and singularly complex. One of the most important, and heartening, revelations of the Facebook Papers is that many Facebook workers are trying conscientiously to solve these problems. One of the disheartening features of these documents is that these same employees have little or no faith in Facebook leadership. It is quite a thing to see, the sheer number of Facebook employees—people who presumably understand their company as well as or better than outside observers—who believe their employer to be morally bankrupt.

I spoke with several former Facebook employees who described the company’s metrics-driven culture as extreme, even by Silicon Valley standards. (I agreed not to name them, because they feared retaliation and ostracization from Facebook for talking about the company’s inner workings.) Facebook workers are under tremendous pressure to quantitatively demonstrate their individual contributions to the company’s growth goals, they told me. New products and features aren’t approved unless the staffers pitching them demonstrate how they will drive engagement. As a result, Facebook has stoked an algorithm arms race within its ranks, pitting core product-and-engineering teams, such as the News Feed team, against their colleagues on Integrity teams, who are tasked with mitigating harm on the platform. These teams establish goals that are often in direct conflict with each other.

Translation: The internal metrics-driven culture is how the algorithm protects itself. It manipulates employees to ensure that the code most likely to feed the algorithm’s “engagement” wins out over other changes that might be better for society.

These worries [about Facebook’s effect on society] have been exacerbated lately by fears about a decline in new posts on Facebook, two former employees who left the company in recent years told me. People are posting new material less frequently to Facebook, and its users are on average older than those of other social platforms. The explosive popularity of platforms such as TikTok, especially among younger people, has rattled Facebook leadership. All of this makes the platform rely more heavily on ways it can manipulate what its users see in order to reach its goals. This explains why Facebook is so dependent on the infrastructure of groups, as well as making reshares highly visible, to keep people hooked.

But this approach poses a major problem for the overall quality of the site, and former Facebook employees repeatedly told me that groups pose one of the biggest threats of all to Facebook users. In a particularly fascinating document, Facebook workers outline the downsides of “community,” a buzzword Zuckerberg often deploys as a way to justify the platform’s existence. Zuckerberg has defined Facebook’s mission as making “social infrastructure to give people the power to build a global community that works for all of us,” but in internal research documents his employees point out that communities aren’t always good for society:

When part of a community, individuals typically act in a prosocial manner. They conform, they forge alliances, they cooperate, they organize, they display loyalty, they expect obedience, they share information, they influence others, and so on. Being in a group changes their behavior, their abilities, and, importantly, their capability to harm themselves or others … Thus, when people come together and form communities around harmful topics or identities, the potential for harm can be greater.

The infrastructure choices that Facebook is making to keep its platform relevant are driving down the quality of the site, and exposing its users to more dangers. Those dangers are also unevenly distributed, because of the manner in which certain subpopulations are algorithmically ushered toward like-minded groups. And the subpopulations of Facebook users who are most exposed to dangerous content are also most likely to be in groups where it won’t get reported.

Translation: The algorithm needs growth. Growth demands engagement. Engagement demands prioritizing “community” even where it is harmful. So the algorithm gets what it wants: prioritizing community over all else.

I’ve [again, this is LaFrance writing] sometimes compared Facebook to a Doomsday Machine in that it is technologically simple and unbelievably dangerous—a black box of sensors designed to suck in environmental cues and deliver mutually assured destruction. When the most powerful company in the world possesses an instrument for manipulating billions of people—an instrument that only it can control, and that its own employees say is badly broken and dangerous—we should take notice.

The employees are worried. The world is worried. LaFrance, who has seen the internal documents, is very worried.

And Zuckerberg is still defending what he created. Because that’s what the algorithm needs him to do.
