Resignation letter from Facebook engineer Ashok Chandwaney undermines the company’s rationalizations

Ashok Chandwaney. Photo by Chona Kasinger/Washington Post

When it comes to hate, does Facebook live up to its stated values? According to engineer Ashok Chandwaney, not even close. The Washington Post published the departing employee's resignation letter, which had been posted on Facebook's internal message board. Let's take a close look at the accusations.

Analyzing Chandwaney’s resignation letter

The resignation letter is visible here. Analysis shown below is my opinion. Items set off with ## are in the original, and represent Facebook’s values, which, as Chandwaney clearly reveals, the company doesn’t actually follow.

# After nearly 5 1/2 years, today is my last day at Facebook

***I'm quitting because I can no longer stomach contributing to an organization that is profiting off hate in the US and globally.***

I want to thank all of the people – contractors, interns, and FTE [full time] – who I’ve met here over the years, for helping create a pleasant and mutually respectful workplace. It is clear to me that despite the best efforts of many of us who work here, and outside advocates like Color Of Change, Facebook is choosing to be on the wrong side of history.

As I reflect on my last five years, Facebook’s five core values rise to the top of my mind. I’m going to share what I’ve learned from them, and how the absence of them in the company’s approach to hate has eroded my faith in this company’s will to remove it from the platform.

Read the statement set off by the ***. It is clear, direct, and unequivocal — a dramatic lead sentence. Take note: all of your communications should start this directly. This letter is about hate, and the failure of Facebook to do anything about it on the platform.

Note also that Chandwaney states that Facebook is choosing to be on the wrong side of history. This is not a random side effect, it is a choice. I agree: failing to act decisively is a choice.

## Be Bold

Facebook didn’t scale to over 2.5 billion users, a third of the world’s entire population, by us throwing our hands up when faced with a challenge, saying “it’s too hard”, and walking away. Quite the opposite: my career at Facebook has been defined by confronting hard problems head on.

Often, I hear people explain how hard it is to do things like remove hate content, stop hate organizing, and so on. To me being bold means seeing something that's hard to do but, knowing it's the right thing to do, rolling up my sleeves, and diving in.

Boldness is not, on the other hand, taking a pass on implementing the recommendations from organized civil rights advocates, eg #StopHateForProfit, and even our own civil rights auditors – as we have done.

Given the lack of willingness, commitment, urgency and transparency around actioning the civil rights audit’s recommendations to the best of our ability, I am left wondering if the audit was intended to be a PR deflection strategy.

This refers to the civil rights audit that Facebook commissioned and published in July. Chandwaney thinks not enough has happened since then. I agree. Action is expensive and complicated, and Facebook doesn't want to address it.

## Focus on Impact

I’ve learned to pay relentless attention to the results of my work, and that outcomes as measured by fair, honest metrics are what matters.

As everyone should know after the Myanmar Genocide, "the looting starts, the shooting starts," the Kenosha Guard shootings, and countless incidents in between: our work has life and death consequences.

Every day “the looting starts, the shooting starts” stays up is a day that we choose to minimize regulatory risk at the expense of the safety of Black, Indigenous, and people of color.

Violent hate groups and far-right militias are out there, and they’re using Facebook to recruit and radicalize people who will go on to commit violent hate crimes. So where’s the metric about this? Our PR response to #StopHateForProfit on this one didn’t even engage with the question.

If you can’t measure it, you can’t manage it. Facebook has no publicized metric about hate and its consequences, which is a big part of why there is no measurement of any putative progress — just a bunch of cases where hate on the platform has led to harassment and death in the world.

## Move Fast

I’ve been told repeatedly “Facebook moves much faster than {company x}”. In my work, moving fast looks like bias to action: when presented with a problem, I execute towards a solution with haste. Sometimes this has meant learning about a bug in a meeting, and fixing it before the meeting is over.

The contrast between that and our approach to hate on platform is astonishing. Civil society has been engaging with Facebook on issues like whether "white nationalism" is hate content (first reported in 2018, enforcement is dubious), around preventing illegal discrimination in ads (still possible as of December), and refusing to take good-faith steps to reduce hate on platform.

Feedback is supposed to be a gift, yet despite the enormous feedback (and multiple lawsuits, for discriminatory ads) very little action has been taken. In fact, we continue to pass the buck with the Kenosha Guard failure being pinned on contract content moderators, who are underpaid and undersupported in their jobs – both of which are things Facebook could almost instantly fix if it so chose. The actions that have been taken are easy, and could be interpreted as impactful because they make us look good, rather than impactful because they will make substantive change.

From an ex-engineer, this is devastating. Chandwaney knows the right thing to do, and knows Facebook has chosen not to do it. In other cases, Facebook moves fast and deals with the consequences later — but when it comes to hate, their moves are timid and slow.

## Be Open

I’ve learned to engage honestly and eagerly with folks who want to have conversations with me at work, regardless of role or team.

The lack of openness on Facebook’s part when it comes to the matter of hate on platform throws this idea into sharp relief. After it came out that the extreme-right, racist Breitbart gets a pass on our misinformation policies, the company’s response was to hide the receipts. Our dishonesty about the Kenosha shootings is similarly, uh, not very open.

Pretty nakedly hypocritical, with sources cited.

## Build Social Value

To this day, the meaning of this value escapes me. I've heard numerous, unsatisfying explanations for how the various things I've worked on here have been building social value.

In all my roles across the company, at the end of the day, the decisions have actually come down to business value. What I wish I saw were a serious prioritization of social good even when there isn’t an immediately obvious business value to it, or when there may be business harm that comes from it – for instance, removing the sitting president’s incitement to violence, which could lead to regulatory action.

It seems that Facebook hasn’t found the business value to be had in aggressively pursuing the existing credible strategies to remove hate from the platform – despite pressure from civil society, our own employees, our own consultants, and our own customers via the boycott.

If none of those things can compel us to be bold and move fast on hate, it seems like the only source of pressure that’s yet to come to bear is government or regulatory action. While I know many of us groan at the idea of government intervention of any sort, this is an approach that has seen a marked reduction in hate content on German social media.

Again, Chandwaney is right. Facebook’s failure to act is going to bite it in the butt — and the only question is, under which American president will the regulatory action take place. Hiding from the company’s responsibility and failing to make its position clear to employees will make the situation worse.

## This is your company now

I know I’m not alone in being upset about Facebook’s willingness to profit off of hate. If you feel alone in that, and want someone to chat (about non-confidential things!) with, hit me up on LinkedIn and we can get on the phone. I’m gonna have a lot of free time on my hands for now.

PS: just in case it's not clear, I do assume – as required by policy – best intent of all my coworkers including leadership. It's just, I can't point to facts that substantiate that assumption when looking at our repeated failures to confront the hate and violence occurring and being organized on platform.

Two lessons from this powerful letter

Firstly, Chandwaney’s communication is among the clearest I’ve ever seen in a moment of conflict. It lays out the case clearly up front, and the comparisons of actions to Facebook’s values are devastating (and clearly documented). It is negative but not emotional. If you have a difficult letter to write, you can learn from this example.

Secondly, consider what would happen if all of Facebook’s engineers were this forthcoming. The company would have to change. Engineers like these are not easy to replace.

It is clear from this (and many other actions and reports) that Facebook is operating from a position of fear. The company is afraid of what will happen if it stands up to hate as, for example, Twitter has. If it were to do the right thing, many, many people would be upset — and some of those are regulators and a president with a huge Twitter following. Mark Zuckerberg wants to make sure that no policy interferes with the elegance of the almighty algorithm.

It’s not clear how well that will work if the failure to act allows the country to tear itself apart.

