Is Meta right to blame its own users for spreading misinformation?

In an interview with Ina Fried of Axios, Andrew Bosworth, who will soon be the CTO of Meta, defended the spread of misinformation on the company’s platforms. The Washington Post described the argument as “Social media doesn’t hurt people. People hurt people.”

Is Bosworth right? After all, people post what they want on Facebook. Do we want to stop that?

They share things they see. Do we want to stop that?

People use mobile phones to plan terrorism and share child pornography. Do we blame the mobile phone companies?

Facebook is not a common carrier

I’d buy Meta’s argument if Facebook didn’t shape what’s shared the way that it does.

Right now, the algorithms at Facebook and Instagram surface what’s popular. This is what causes the most outrageous information to spread.

Meta is not neutral. Its algorithm prioritizes extremes. That’s what makes it a vector for the worst content, and it’s why lies spread: people interact with them, and the algorithm amplifies whatever gets interaction.

It’s yet another example of how the algorithm runs Facebook and Instagram, not the other way around.

A thought experiment

Imagine two public parks dedicated to free speech.

In the first park — call it Hyde Park — anyone can stand on a soapbox and speak. They can reach whatever crowd assembles from people walking by.

In the second park — call it Meta Park — again, anyone can stand on a soapbox and give a speech. But there are amplified speakers all around the park. And if a crowd reaches a certain size — say 20 people — then the person speaking has their voice amplified and broadcast throughout the park. And if the total audience of the speech, including the people gathered around the amplified speakers, exceeds 100 people, it gets louder and louder and gathers an ever bigger audience.

Both parks are allowing free speech. But Meta Park is amplifying the audience effect. If Meta Park’s speakers were all over the world, it would be a lot like Meta’s properties Facebook and Instagram.
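The Meta Park rule is, at bottom, a feedback loop with thresholds. Here’s a toy sketch of it in Python — every number and multiplier is a hypothetical from the thought experiment above, not anything Meta actually does:

```python
# Toy model of the Meta Park amplification rule described above.
# All thresholds and multipliers are hypothetical illustrations,
# not Facebook's actual ranking algorithm.

def meta_park_audience(initial_crowd: int,
                       amplify_threshold: int = 20,
                       viral_threshold: int = 100,
                       amplify_factor: int = 5,
                       viral_factor: int = 2,
                       rounds: int = 3) -> int:
    """Return the final audience for one speech in Meta Park."""
    audience = initial_crowd
    # Rule 1: a big enough crowd gets the speech amplified park-wide.
    if audience >= amplify_threshold:
        audience *= amplify_factor
    # Rule 2: past the viral threshold, each round makes it louder still,
    # gathering an ever bigger audience.
    for _ in range(rounds):
        if audience >= viral_threshold:
            audience *= viral_factor
    return audience

# In Hyde Park, the audience stays whatever walks by.
print(meta_park_audience(10))   # → 10: never amplified
print(meta_park_audience(25))   # → 1000: amplified, then compounds
```

The point of the sketch: a crowd of 10 and a crowd of 25 differ by 2.5x at the soapbox, but by 100x after the park’s speakers are done — the amplification, not the speaker, determines the reach.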

I can imagine the maintainers of Meta Park saying “Hey, people are speaking and other people are listening. We’re just here in the middle helping.”

But only the most naive observer wouldn’t blame the proprietors of Meta Park if a crazed mob formed because a lone nut won the amplification process.

Meta is a vector for evil. Crush Facebook.

7 responses to “Is Meta right to blame its own users for spreading misinformation?”

  1. I’m currently reading a book that might interest you, Josh: You Are Here: A field guide for navigating polarized speech, conspiracy theories and our polluted media landscape by Whitney Phillips and Ryan Milner.

  2. FB is a mirror that reflects us. Who makes posts popular? Not the algorithm — it’s the users.

    Social media has made everyone a publisher. It’s as if print newspapers published every letter to the editor.
    What they need to do is exercise editorial review. They try to automate it and they’re scared to do it because they have to address the jesting Pilate conundrum: “What is truth?” Who decides? Who benefits?

    The final solution is to kill the news feed, which is the goose that laid the golden egg. Short of that, they might adopt standards and enforcement more like LinkedIn.

  3. One thing that wasn’t part of the big picture is shadow-banning. Facebook does far more evil by shadow-banning and eliminating the “level” playing field they so eloquently bragged about in the beginning. Yes, they push the ones who promote the narrative, but they also throttle back exposure of many. The insidious part of that is the victim doesn’t really know they’ve been banned. A user has 200 friends / followers. A shadow-banned user might only appear in the feeds of a few. The user thinks all their friends see the post but they don’t.

    At 60-Seconds, they recently ran a test of the Facebook system. 60-Seconds is the MOST shadow-banned page in all of Facebook. Try a search. It will not turn up any results. But do a search for “Jihad” on Facebook and you come up with over 3 million hits. So anyway, they tried a test — they would send you a $25 Amazon gift certificate just for posting.

    I’ve got 165 “friends”/followers. I “shared” the promotion. It got TWO responses: myself and one other. Now, out of 165 followers, you’d think there would be more than a single response. The truth is, NOBODY saw the post on 60-Seconds, and when I shared it, they shadow-banned that post and only ONE of my followers saw it.

    Sure. A ragged test. But try that test over on the NPR page, or the Chuck Todd page. You’ll get a million contacts.

    There’s nothing you can do. We’re vastly outnumbered. Facebook will continue, and will get worse and worse. They’re exposure proof, shame proof and retaliation proof. With a billion users, they couldn’t give a hoot about you or me.

    Nothing else to say.

    1. Fred, I can’t seem to verify your facts.

      When I search Facebook, although we are not friends there, the 60-seconds page appears in 6th position among pages. It says it is run by Fred Showker.

      The page has 18 followers. Given that the average “organic reach” of a Facebook page is 5%, you would expect a post from that page to reach an average of only one person.

      Facebook’s algorithms know when you are sharing your own page and they damp down the reach of your personal posts in that situation.

      The Alexa rank of 60-seconds.com is nonexistent and a total of 6 sites link to it. This is consistent with a very low level of traffic. Withoutbullshit.com ranks 411,000 and 175 sites link to it. NPR.org is ranked 635 and 118,648 sites link to it. The number of people sharing a given site affects how often it appears in people’s Facebook feeds. This is one reason that NPR.org appears frequently, withoutbullshit.com appears rarely, and apparently, 60-seconds.com hardly appears at all. These results are consistent with Facebook’s usual methods for ranking sites, not with a shadow ban.

      A Google search on the site 60-seconds.com finds no matches for Amazon gift 25. Did you take it down?
