An open letter to Mark Zuckerberg, CEO, and Sheryl Sandberg, COO, of Facebook:
False news is out of hand. People read lies and obliviously share them on Facebook.
You can solve this problem. Not only that, you can solve it in a scalable way, without censoring anything and without changing your main sharing algorithm. Let’s see how.
First, the scope of the problem. The Web is full of sites that just make things up. I’m talking fake news, not obvious satires like The Onion. Many of my otherwise rational friends share outrageous but believable news from sites like the misleadingly named nbc.com.co, which falsely announced, for example, that a pastor in Vermont was jailed for refusing to perform a gay marriage. There are hundreds of sites now making up fake news. A few are trying to be funny, some are trying to cause trouble, and all are using Facebook to generate ad revenues.
The flood has overwhelmed the Washington Post’s Caitlin Dewey, who gave up on her column “What was fake on the Internet this week.” As she put it:
If you’re a hoaxer, it’s [now] more profitable. Since early 2014, a series of Internet entrepreneurs have realized that not much drives traffic as effectively as stories that vindicate and/or inflame the biases of their readers. Where many once wrote celebrity death hoaxes or “satires,” they now run entire, successful websites that do nothing but troll convenient minorities or exploit gross stereotypes.
It only takes a minute to check a story on Snopes.com or in a reputable news source. But a minute is too much. Most people would rather share an outrageous story than take a minute to check it.
Facebook optimizes its algorithm to generate as many views as possible. Since members like to share fake news, Facebook’s algorithm elevates its visibility. I know that you have finely tuned the algorithm to work well for your company, your advertisers, and your users. I wouldn’t ask you to change that.
Instead, you can solve the problem by creating a parallel algorithm, an automated “Truth Algorithm,” to help Facebook users evaluate content. The Truth Algorithm includes three parts, all of which need to work together (a rough sketch in code follows the list):
- Each link post includes a Truth Icon. Any Facebook user can click on the Truth Icon and report the post as real, fake, or satire — just as anyone can “like” a post.
- The Truth Algorithm rates each user’s reliability. If a user frequently reports items as false when other reliable users rate them as true, or vice versa, then that user’s reliability is low. If their reports track an identifiable bias (for example, rating all negative items about Obama as false), their reliability is also low. But if their reports are similar to those of other reliable users, then their reliability is high.
- The Truth Algorithm rates each post and shows the appropriate Truth Icon based on the reports of reliable users: a smile for true content, a frown for fake content, a laughing face for satire, or a question mark when there is not enough information.
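To make the three parts concrete, here is a minimal sketch, in Python, of how reports from users of known reliability could be combined into a Truth Icon for a single post. Everything in it — the report categories, the thresholds, and the function name `truth_icon` — is my own illustrative assumption, not anything Facebook has built.

```python
from collections import Counter

# Illustrative report categories a user can choose when clicking the Truth Icon.
REAL, FAKE, SATIRE = "real", "fake", "satire"

# The icon shown on the post for each verdict; None means "not enough information."
ICONS = {REAL: "smile", FAKE: "frown", SATIRE: "laughing face", None: "question mark"}

# Assumed thresholds, purely for illustration.
MIN_RELIABILITY = 0.7   # ignore reports from users the algorithm has learned to distrust
MIN_WEIGHT = 15.0       # require this much total reliable "weight" before showing a verdict
MIN_AGREEMENT = 0.75    # require this fraction of the weight behind a single verdict


def truth_icon(reports, reliability):
    """Pick a Truth Icon for one post.

    reports     -- list of (user_id, category) tuples from Truth Icon clicks
    reliability -- dict mapping user_id -> reliability score in [0, 1]
    """
    # Weight each report by the reporting user's reliability.
    weights = Counter()
    for user_id, category in reports:
        score = reliability.get(user_id, 0.0)
        if score >= MIN_RELIABILITY:
            weights[category] += score

    total = sum(weights.values())
    if total < MIN_WEIGHT:
        return ICONS[None]          # too few reliable reports so far

    category, weight = weights.most_common(1)[0]
    if weight / total < MIN_AGREEMENT:
        return ICONS[None]          # reliable users disagree; stay neutral

    return ICONS[category]
```

One design point worth noting: when reliable users split on a post, the sketch keeps the question mark rather than forcing a verdict, which is exactly the cautious behavior the third part of the algorithm calls for.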
Some readers of this open letter may believe there is a chicken-and-egg problem here — you can’t rate items until you rate users, and vice versa. But as experts on social algorithms, Mr. Zuckerberg and Ms. Sandberg, you know better. Truth ratings for both news items and users will be an emergent social phenomenon. Each click on the Truth Icon will give Facebook more information about both posts and users. Users’ behavior will rapidly fall into patterns: for example, those who rate only conservative items as false, or those who align with other users who have checked Snopes or the original source. Similarly, the ratings on the news items will fall into patterns. The algorithm will recognize those patterns and classify both the users and the news items. Given the billions of article reads that take place every day, I expect the Truth Algorithm to quickly become accurate.
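That emergent, chicken-and-egg behavior is easiest to see in a small simulation. The sketch below is again only an illustration built on my own assumptions — the function `estimate`, the +1/−1 vote encoding, and the sample post names are all hypothetical. It alternates between two steps: estimate each post’s truth score from the reliability-weighted votes of its reporters, then re-estimate each user’s reliability from how often their reports agree with the current consensus. The real Truth Algorithm would of course be far more sophisticated.

```python
def estimate(reports, iterations=10):
    """Jointly estimate post truth scores and user reliabilities.

    reports -- dict mapping post_id -> {user_id: vote}, where a vote is
               +1 (reported "real") or -1 (reported "fake").
    Returns (truth, reliability).
    """
    users = {u for votes in reports.values() for u in votes}
    reliability = {u: 0.5 for u in users}   # everyone starts as an unknown quantity
    truth = {p: 0.0 for p in reports}

    for _ in range(iterations):
        # Step 1: a post's truth score is the reliability-weighted average of its votes.
        for post, votes in reports.items():
            total = sum(reliability[u] for u in votes)
            if total > 0:
                truth[post] = sum(reliability[u] * v for u, v in votes.items()) / total

        # Step 2: a user's reliability is the fraction of their votes that agree
        # with the emerging consensus on the posts they rated.
        agreement = {u: [] for u in users}
        for post, votes in reports.items():
            if truth[post] == 0.0:
                continue                    # no consensus on this post yet
            for u, v in votes.items():
                agreement[u].append(1.0 if v * truth[post] > 0 else 0.0)
        for u, scores in agreement.items():
            if scores:
                reliability[u] = sum(scores) / len(scores)

    return truth, reliability


# A tiny example: three users rate two posts. Two users agree with each other;
# the third always reports the opposite.
reports = {
    "post_about_pastor": {"alice": -1, "bob": -1, "carol": +1},
    "post_about_onion":  {"alice": +1, "bob": +1, "carol": -1},
}
truth, reliability = estimate(reports)
print(truth)        # post_about_pastor trends negative (fake), post_about_onion positive
print(reliability)  # alice and bob trend toward 1.0, carol toward 0.0
```

Even in this toy example, a couple of iterations are enough for the two users who agree to pull both posts toward a consensus while the contrarian’s reliability drops toward zero — the same pattern-finding described above, just at Facebook scale.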
This algorithm does not require Facebook to make judgments about truth from content, which would subject it to all sorts of criticism regarding bias. It does not censor anything. And since it is an algorithm, it scales. It uses the intelligence of Facebook and its users to provide information to all of us about what we’re reading. Like all efficient social algorithms, it uses the activity of a few socially active news readers (those who carefully check the veracity of news items) to benefit the mass of people who just read and share news items.
Mr. Zuckerberg, you just agreed to give away 99% of your shares to charitable organizations. I know you have a conscience, you understand that this is important, and you know this will work. Facebook is now the primary way that people share news. You are responsible for helping us to evaluate this news and understand its truth, if you can. So please implement the Truth Algorithm and the Truth Icon.
Josh Bernoff, president of WOBS LLC, concerned American, and loyal Facebook member.