Mark Zuckerberg defended Facebook this weekend regarding the role of fake news in the election. It’s way worse than he says, and he has a lot of work to do to fix it. Ignorance is the biggest problem in our democracy, and Facebook is making it worse.
First, let me ask you: was your news feed full of questionable “news” during the run-up to the election, or was it accurate? Please post a comment and share your experience.
A taxonomy of fake news
My experience, and that of the people I connect with, was that there was a lot of inaccurate crap in my newsfeed. It’s not all “fake news.” It falls into various categories:
- False statements. For example, “Trump won the popular vote,” debunked by Snopes. This is actual fake news.
- Parody masquerading as fact. For example, “Caitlyn Jenner joins Donald Trump’s Transition Team,” from “The National Report.”
- Parody that’s clearly parody. Like The New Yorker’s Borowitz Report (“Trump confirms that he just Googled Obamacare.”)
- Slanted news. Yes, Fox News and MSNBC, but also a host of sites like Breitbart.com that selectively report facts that support their positions. (And yes, Trump just appointed Breitbart’s former executive chairman as his chief strategist.)
- Obsolete news. For example, a highly intelligent friend of mine recently posted an article about Maryland dumping the electoral college for a scheme to give all its electoral votes to the winner of the popular vote if enough other states do the same. This is true. It also happened in 2007, which was the date on the article.
The problem is worse because the real news is so surreal now. The FBI director is investigating Hillary Clinton’s emails on the laptop of an accused sex offender? Donald Trump said you should grab women by the pussy? Real news, but hard to believe. And when that is the real news, stuff like the Caitlyn Jenner article doesn’t seem so implausible.
Zuckerberg’s defense is weak
How does Zuckerberg respond to Facebook’s role in this? With denial. Here’s some of his statement posted on Facebook. I added the bold to show weasel words.
> After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care **deeply** about getting them right. I want to do my best to explain what we know here.
>
> Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.
Sorry, Zuck, this is bullshit. Any time you see the word “deeply,” your bullshit detector should go off. In this case, here’s why he’s wrong.
- While it’s true that there are fake news stories from all political viewpoints, that’s not what people see. Because of the Facebook algorithm, they only see fake news from the viewpoint that matches their own. If you’re liberal, you see fake news that confirms your belief, and the reverse is true for conservatives.
- I saw hundreds of fake stories — a lot more than 1%. And it was clear that my friends who shared them were duped into believing they were real (or worse yet, didn’t care if they were real when they spread them).
- Zuckerberg’s definition of “fake” is too weak — it doesn’t include parodies, slanted news, or obsolete news. It’s not just a fake news problem.
- Facebook can’t even solve its fake ads problem. Its ads often lead to sites with false claims (Sly Stallone is dead) and fake logos, masquerading as legitimate news sites to sell nutritional supplements.
- In a close election like this, it only takes a few weak-minded, ignorant idiots to change the outcome.
How fake news changed the results of the election
Consider a state like Pennsylvania, for example. Trump won by 64,000 votes out of 6 million votes cast, or about 1% of the total vote. Imagine, for the sake of argument, that 2.94 million people voted for Hillary Clinton because of a belief based on the truth, and 2.94 million voted for Donald Trump based on a true perception of him.
That leaves 120,000 voters, 2% of the total, who don’t have a true perception of either candidate. Where did they get their information?
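The arithmetic here is easy to check. A quick sketch, using the article’s round figures and assuming 2.94 million truly informed voters per side (the split that leaves 2% of the electorate unaccounted for):

```python
# Back-of-the-envelope math for the Pennsylvania scenario.
# These are the article's rough, illustrative figures, not
# official certified results.
total_votes = 6_000_000
margin = 64_000            # Trump's reported lead
informed_each = 2_940_000  # voters with a true perception, per side

persuadable = total_votes - 2 * informed_each
print(persuadable)                          # 120,000 voters
print(f"{persuadable / total_votes:.0%}")   # 2% of the electorate
print(margin < persuadable)                 # True: the margin fits inside that group
```

The point of the last line: the winning margin is smaller than the persuadable group, so that group alone could have swung the state.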
Is it that hard to believe that 2% of the people in Pennsylvania, or anywhere, are weak-minded idiots? I know they exist, because I’ve seen them on the news and in my Facebook feed. If you believe the stupidest 2% of Pennsylvanians are still smart enough to spot fake news, you’ll have to convince me why.
If fake or distorted news drives more weak-minded, ignorant idiots in Pennsylvania to vote Trump — or drives more idiots leaning toward Clinton to give up, stay home, or vote for a third party — he wins the state. Notice that you don’t have to believe all or even most of the Trump voters believe this stuff. All it takes is a few ignorant people in a few swing states to make the difference.
Did that happen in Pennsylvania, Wisconsin, Michigan, Ohio, North Carolina, and Florida? For that matter, did it happen in Clinton’s favor in New Hampshire, Colorado, or Nevada? It’s certainly plausible.
How to solve the problem
Zuckerberg says he’s working on the problem:
> This is an area where I believe we must proceed very carefully though. Identifying the “truth” is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
It is complicated. We need a system (like Verytas, the browser extension that Sam Mallikarjunan proposed) that marks sites and articles as fake, satirical, slanted, or obsolete, using a community-based system. The system needs to rate the raters as well as the articles, to ensure that partisans on one side or the other can’t skew it. (Only if you are left-right balanced in your news scoring will your ratings have weight.)
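The mechanics of a rate-the-raters scheme are straightforward to sketch. Here is a minimal, hypothetical illustration in Python (all names are invented for this example; Verytas’s actual design may differ): a rater whose flags are split evenly between left- and right-leaning sources gets full weight, while a one-sided rater’s votes count for almost nothing.

```python
from collections import defaultdict

# Hypothetical sketch of the "rate the raters" idea.
# A rater's influence depends on how evenly their past flags split
# between left- and right-leaning sources; one-sided raters get ~0 weight.

LABELS = {"fake", "satire", "slanted", "obsolete", "ok"}

def rater_weight(flags_left: int, flags_right: int) -> float:
    """Weight in [0, 1]: 1.0 for a perfectly balanced rater, 0.0 for a one-sided one."""
    total = flags_left + flags_right
    if total == 0:
        return 0.0
    return 1.0 - abs(flags_left - flags_right) / total

def score_article(ratings):
    """ratings: list of (label, rater_flags_left, rater_flags_right).
    Returns the label with the highest balance-weighted vote."""
    tally = defaultdict(float)
    for label, left, right in ratings:
        if label in LABELS:
            tally[label] += rater_weight(left, right)
    return max(tally, key=tally.get) if tally else "unrated"

# Two balanced raters flagging an article as fake outweigh one
# purely partisan rater insisting it's fine.
ratings = [("fake", 10, 9), ("ok", 20, 0), ("fake", 7, 8)]
print(score_article(ratings))  # "fake" wins; the one-sided "ok" rater has zero weight
```

The design choice this illustrates: you never have to judge the article’s politics directly, only each rater’s track record of even-handedness, which is much harder to game.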
This isn’t easy. But it is necessary. All of the skullduggery on these sites lies at Facebook’s feet. You can blame the partisan fake-news spreaders all you want. But you can’t stop them, any more than you can stop a cockroach infestation by stepping on them. You need to clean up the kitchen. It’s Facebook’s kitchen to clean up.
I’d like to start a non-profit, non-partisan tech organization to force this change. Are you with me? Add your name in the comments, along with the skills you bring.