If you had profited from a highly addictive drug that had done great damage to society, what would you do? A trial about this — about Insys and fentanyl — is happening right now in Boston. But I’m not talking about opioids. I’m talking about Facebook.
Mark Zuckerberg’s op-ed in the Wall Street Journal looks different when you realize he’s created billions of addicts. Here’s the text of his op-ed with my translation in italics.
The Facts About Facebook
We need your information for operation and security, but you control whether we use it for advertising.
Facebook turns 15 next month. When I started Facebook, I wasn’t trying to build a global company. I realized you could find almost anything on the internet—music, books, information—except the thing that matters most: people. So I built a service people could use to connect and learn about each other. Over the years, billions have found this useful, and we’ve built more services that people around the world love and use every day.
Recently I’ve heard many questions about our business model, so I want to explain the principles of how we operate.
I didn’t realize at first that connecting people to each other online would be addictive. But billions of people now can’t live without a daily hit of Facebook. I’ll explain how we made that happen.
I believe everyone should have a voice and be able to connect. If we’re committed to serving everyone, then we need a service that is affordable to everyone. The best way to do that is to offer services for free, which ads enable us to do.
As Tom Lehrer wrote in his song “The Old Dope Peddler,” “We give the kids free samples, because we know full well / That today’s young innocent faces will be tomorrow’s clientele.” Addiction works better when it’s free. Don’t worry about us; we get plenty of money from ads.
People consistently tell us that if they’re going to see ads, they want them to be relevant. That means we need to understand their interests. So based on what pages people like, what they click on, and other signals, we create categories—for example, people who like pages about gardening and live in Spain—and then charge advertisers to show ads to that category. Although advertising to specific groups existed well before the internet, online advertising allows much more precise targeting and therefore more-relevant ads.
It’s painless at first, but we’re draining the data from you every moment you’re high on Facebook. That’s why you feel worn out and enervated after using it.
The internet also allows far greater transparency and control over what ads you see than TV, radio or print. On Facebook, you have control over what information we use to show you ads, and you can block any advertiser from reaching you. You can find out why you’re seeing an ad and change your preferences to get ads you’re interested in. And you can use our transparency tools to see every different ad an advertiser is showing to anyone else.
Still, some are concerned about the complexity of this model. In an ordinary transaction, you pay a company for a product or service they provide. Here you get our services for free—and we work separately with advertisers to show you relevant ads. This model can feel opaque, and we’re all distrustful of systems we don’t understand.
Ads are toxic. But we deliver them at a level that you can tolerate. If you dislike one toxin, you can ask us to replace it with another.
Sometimes this means people assume we do things that we don’t do. For example, we don’t sell people’s data, even though it’s often reported that we do. In fact, selling people’s information to advertisers would be counter to our business interests, because it would reduce the unique value of our service to advertisers. We have a strong incentive to protect people’s information from being accessed by anyone else.
Once we extract your data, we never give it to anyone else. We tried that, and it ended badly. We know best what toxins you can tolerate, so we’re not going to let anyone else have access to the data we use to calibrate that. Trust us, it’s safer that way.
Some worry that ads create a misalignment of interests between us and people who use our services. I’m often asked if we have an incentive to increase engagement on Facebook because that creates more advertising real estate, even if it’s not in people’s best interests.
We’re very focused on helping people share and connect more, because the purpose of our service is to help people stay in touch with family, friends and communities. But from a business perspective, it’s important that their time is well spent, or they won’t use our services as much over the long term. Clickbait and other junk may drive engagement in the near term, but it would be foolish for us to show this intentionally, because it’s not what people want.
We’re very careful to make Facebook as addictive as possible. We won’t give you quick hits that might make you feel like you’ve overdosed. A steady drip of interactions with family and sharing of questionable news and memes keeps you coming back without making you too sick.
Another question is whether we leave harmful or divisive content up because it drives engagement. We don’t. People consistently tell us they don’t want to see this content. Advertisers don’t want their brands anywhere near it. The only reason bad content remains is because the people and artificial-intelligence systems we use to review it are not perfect—not because we have an incentive to ignore it. Our systems are still evolving and improving.
We’re sorry if you have a bad trip. We can’t always stop that.
There’s no question that we collect some information for ads—but that information is generally important for security and operating our services as well. For example, companies often put code in their apps and websites so when a person checks out an item, they later send a reminder to complete the purchase. But this type of signal can also be important for detecting fraud or fake accounts.
We give people complete control over whether we use this information for ads, but we don’t let them control how we use it for security or operating our services. And when we asked people for permission to use this information to improve their ads as part of our compliance with the European Union’s General Data Protection Regulation, the vast majority agreed because they prefer more relevant ads.
Ultimately, I believe the most important principles around data are transparency, choice and control. We need to be clear about the ways we’re using information, and people need to have clear choices about how their information is used. We believe regulation that codifies these principles across the internet would be good for everyone.
Extracting data from you is crucial to making this addiction work and regulating the toxins we put into your body. We’d like regulators to endorse the way we do this so we’re not at risk of losing our legal right to addict you.
It’s important to get this right, because there are clear benefits to this business model. Billions of people get a free service to stay connected to those they care about and to express themselves. And small businesses—which create most of the jobs and economic growth around the world—get access to tools that help them thrive. There are more than 90 million small businesses on Facebook, and they make up a large part of our business. Most couldn’t afford to buy TV ads or billboards, but now they have access to tools that only big companies could use before. In a global survey, half the businesses on Facebook say they’ve hired more people since they joined. They’re using our services to create millions of jobs.
We’re balancing the needs of billions of addicted people and millions of people who want to deliver toxins to them. It’s vastly profitable to control a mass of addicts. So we work hard to keep the addiction strong and the toxin levels manageable.
For us, technology has always been about putting power in the hands of as many people as possible. If you believe in a world where everyone gets an opportunity to use their voice and an equal chance to be heard, where anyone can start a business from scratch, then it’s important to build technology that serves everyone. That’s the world we’re building for every day, and our business model makes it possible.
For us, technology has always been about addicting as many people as possible, making them feel powerful when they are actually powerless. That’s better living through technology.
Take another hit
Gotta go . . . I’m jonesing for another hit of Facebook. It’s so convenient, and it’s free!
For another perspective, see this from Kara Swisher.