Facebook is toxic. And its leaders no longer give a crap about even appearing to be otherwise.
The reporting about Facebook this month has been brutal and disturbing. Here’s how the Wall Street Journal describes its “Facebook Files” series, based on internal Facebook documents:
Facebook Inc. knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands. That is the central finding of a Wall Street Journal series, based on a review of internal Facebook documents, including research reports, online employee discussions and drafts of presentations to senior management.
Time and again, the documents show, Facebook’s researchers have identified the platform’s ill effects. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them. The documents offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the chief executive himself.
In separate articles, the Journal showed that:
- It moderates famous and prominent people under a different set of rules than it applies to everyone else, purely to avoid “PR fires” — even when those people engage in rule-violating activities like posting “revenge porn.”
- According to Facebook’s own internal documents, its Instagram social network makes “body image issues worse for one in three teen girls,” but Facebook hasn’t done anything about an algorithm that leads to anorexia and depression.
- Its algorithm rewards outrage, elevating and exacerbating the conflict between the most extreme elements of politics and society.
- In developing countries, Facebook takes only the weakest of steps to prevent drug cartels and human traffickers from using its platforms.
- Publicly, it promoted efforts to get members to take the coronavirus vaccine, but at the same time, its platform became a hotbed for spreading false and misleading anti-vaccine comments.
The company is under attack from investigative journalists with access to leaked documents. In the past, Facebook would have at least paid lip service to the idea that it was addressing the problem. But now, it is apparently ready to embrace its role as a vector to spread the worst ills of society. An article in the New York Times reveals how Facebook will now use its own platform to spread feel-good stories about itself, an effort called “Project Amplify.”
Project Amplify punctuated a series of decisions that Facebook has made this year to aggressively reshape its image. Since that January meeting, the company has begun a multipronged effort to change its narrative by distancing Mr. Zuckerberg from scandals, reducing outsiders’ access to internal data, burying a potentially negative report about its content and increasing its own advertising to showcase its brand.
The moves amount to a broad shift in strategy. For years, Facebook confronted crisis after crisis over privacy, misinformation and hate speech on its platform by publicly apologizing. Mr. Zuckerberg personally took responsibility for Russian interference on the site during the 2016 presidential election and has loudly stood up for free speech online. Facebook also promised transparency into the way that it operated.
But the drumbeat of criticism on issues as varied as racist speech and vaccine misinformation has not relented. Disgruntled Facebook employees have added to the furor by speaking out against their employer and leaking internal documents. Last week, The Wall Street Journal published articles based on such documents that showed Facebook knew about many of the harms it was causing.
So Facebook executives, concluding that their methods had done little to quell criticism or win supporters, decided early this year to go on the offensive, said six current and former employees, who declined to be identified for fear of reprisal.
The company is no longer apologizing for its role in spreading lies, hate, conflict, and crime. It has also cut off the data access that researchers were using to measure how negative content spreads online. Facebook is circling the wagons. It no longer even wants to appear to give a crap about the effect of its algorithms on society.
In an epic series of tweets, the author and activist Cory Doctorow skewered the company and its effects on journalism and society.
Here’s my takeaway.
Facebook is one of the most evil companies on the planet. It creates the conditions to make the world meaner, worse, less safe, and more laden with conflict, and then does the absolute minimum to deal with the backlash.
Facebook will not change. Change is hard. It’s a lot easier to deal with these challenges through lobbying, negotiating, and spin than to fix what is essentially unfixable — an algorithm that amplifies the worst of society.
In the past I called for regulating Facebook. That time has passed.
I now call on the US, EU, and any other government to crush Facebook.
Break it up into pieces.
Require it to open its data to regulators and academics for study.
Create onerous regulations that will be expensive to address and will cripple the company’s business model.
I don’t much care what the justification is. I don’t much care what the implementation is. It’s time to push back. Facebook can no longer keep doing what it is doing — it’s bad for the world, and it must stop.
Share this on Facebook
I’m not quitting Facebook. My quitting would make no difference.
But I am curious what will happen with this piece.
I’m going to ask you right now to share this article on Facebook. I’m curious to see whether it actually appears in my friends’ news feeds, or whether it somehow gets less visibility than it should.
If you’re on Facebook, post a link to this article where your friends will see it. Then comment here and let me know. I’ll be watching.