Why I disagree with Facebook’s defenders

Last week, I said we need to crush Facebook. A lot of you agreed. But a few found ways to explain that Facebook is just a normal part of the internet, magnifying what we find engaging, and that crushing it is therefore an overreaction.

I respect these arguments and the people who make them. But I disagree. So let’s examine the arguments and ask if they justify Facebook’s existence.

Isn’t Facebook just an emotion amplifier?

Here’s what one commenter said:

I usually agree with you, Josh, but wonder if you are overreacting a bit. Facebook is in essence a human-emotion amplification platform, and its business model hinges upon keeping its users engaged. In other words, the very structure of FB — to make emotions stronger, and get people to share more — brings out some of the bad in people. I don’t like seeing misinformation on Facebook, or screeds and fights in the comments based on minor political differences, but I do feel users need to take accountability for the nastiness they share and the idiocy they can fall prey to.

Instagram has a different algo, one designed to encourage sharing image idealism, and that also pulls out the worst fakery in us and makes some of us feel bad; still, that’s the core structure of IG, an accelerant of desirability.

So I hear you, but perhaps another way to digest all this is … modern amplification platforms that bring out human nature maybe shouldn’t be blamed, because all they are doing is amplifying human nature. Peace brother.

The gist of this argument is that people are responsible for their own posts and responses, even the idiotic ones, and that human nature is to blame.

I agree with this commenter that Facebook’s “business model hinges upon keeping its users engaged. In other words, the very structure of FB — to make emotions stronger, and get people to share more — brings out some of the bad in people.”

But that’s where I see the problem: Facebook’s design amplifies conflict, anger, misinformation, lies, and division. I certainly know that these are part of what humans do. But I’m extremely concerned about a global, universal platform with billions of users that amplifies that negativity. I don’t think that’s an overreaction.

We have a history of holding platforms and amplifiers responsible for their acts and words.

If a newspaper quotes a source falsely denigrating another person, and the newspaper knows (or should know) that the person it is quoting is lying, the newspaper can be sued for libel. But if Facebook hosts lies, it bears no consequences, due to its Section 230 protection. This is a choice we as Americans have made in our laws, and we can change it.

If a drug addict indulges their worst and most violent tendencies under the influence, we can certainly blame the addict. But we also attempt to outlaw and limit distribution of the drug because of the damage it causes.

If you enable a foreign agent to cause damage to America, you can be prosecuted, even if the source of the problem is the foreign agent. But Facebook’s algorithms fueled the massive spread of foreign propaganda during the 2020 election, and the social network has proven impotent at stopping such activity.

If you build a pipe that discharges toxic sludge from a factory into the river, certainly the factory is at fault. But you are at fault as well, for channeling the discharge where it can do the most damage.

This article by Adrienne LaFrance, executive editor of The Atlantic, characterizes Facebook as a massive, autocratic foreign power threatening the US, with no real check on its power. As she writes:

Facebook’s defenders like to argue that it’s naive to suggest that Facebook’s power is harmful. Social networks are here, they insist, and they’re not going anywhere. Deal with it. They’re right that no one should wish to return to the information ecosystems of the 1980s, or 1940s, or 1880s. The democratization of publishing is miraculous; I still believe that the triple revolution of the internet, smartphones, and social media is a net good for society. But that’s true only if we insist on platforms that are in the public’s best interest. Facebook is not.

Facebook is a lie-disseminating instrument of civilizational collapse. It is designed for blunt-force emotional reaction, reducing human interaction to the clicking of buttons. The algorithm guides users inexorably toward less nuanced, more extreme material, because that’s what most efficiently elicits a reaction. Users are implicitly trained to seek reactions to what they post, which perpetuates the cycle. Facebook executives have tolerated the promotion on their platform of propaganda, terrorist recruitment, and genocide. They point to democratic virtues like free speech to defend themselves, while dismantling democracy itself.

If you agree that Facebook is a “lie-disseminating instrument of civilizational collapse,” then we need to stop it. By we, I mean the United States government and the governments of other nations and unions of nations. When you have the highly liberal Elizabeth Warren and the deeply conservative Josh Hawley on the same page about this, there is clearly an opportunity to take regulatory and legislative action.

The fact that the negative emotions begin with a human is not the point. Yes, humans are capable of negative emotions. But society functions only when those negative emotions are not repeated, echoed, and amplified to the point where they destroy us.

But what about . . .

Here’s the other objection. As one commenter wrote, Facebook is not the whole problem, or even the worst of it:

FB is far from alone and not necessarily the most prominent in spreading disinformation.

How about YouTube – the second biggest search engine – and its role in spreading disinformation…
https://www.npr.org/2021/04/13/986678544/exploring-youtube-and-the-spread-of-disinformation

If one wants to dig into it more, I’m sure just about every platform can be found guilty of many of the same criticisms raised here.

Should we crush them all? Including blogs that are inflammatory?

The platforms are not without criticism, to be sure. But, it seems to me that the blame is misplaced.

What is disturbing MOST of all, is that we are seeing what people are REALLY thinking and there is a LOT of ugly there.

If you agree that sites that encourage conflict are a problem, you have to start somewhere. I’d argue that Facebook — which is basically a massive, globe-girdling hate and conflict amplifier — is the right place to start.

YouTube is a huge problem as well. Taking on Facebook — and what we learn from doing so — will help with approaches to the YouTube problem as well. We can do both.

Stop Facebook, YouTube, and Instagram from promoting hate, falsehood, jealousy, and depression, and you’ve made a pretty good start on the problem.

If you don’t believe me, take a look at how much weaker and less influential Donald Trump’s ideas are now that he is banned from Facebook, Instagram, YouTube, and Twitter. He’s still there. He’s still saying the same stuff. But now there are not 50 million people on a social network spreading everything he says. Deplatforming works. So will reining in, regulating, breaking up, and crushing toxic platforms.

I still want to crush Facebook. I’m not backing down. How about you?

2 Comments

  1. When a news provider is taken over by people who merely use it as a cash cow, what bleeds or shouts leads, and society pays. But when news providers support journalism with a bias towards democracy and transparency, it is informative, and society benefits. Facebook claims its goal is to bring people together, but it acts as a platform for clickbait, ads, and outrage. It has wealth and influence equal to some foreign countries. Perhaps imposing sanctions… Perhaps the algorithm could be changed to amplify legitimate news outlets.

  2. Thanks for this, Josh. Totally agree and support any and all moves to rein in the outrageously over-amplified power these platforms governed by super-rich megalomaniacs have acquired. As you mention, other news distributors are held accountable; these must be too.