In the wake of the Cambridge Analytica scandal, Facebook and its CEO Mark Zuckerberg appear to be in trouble. Zuckerberg published an explanation, participated in an interview with The New York Times, and appeared on CNN. People are upset, and some users are quitting Facebook. But nothing is going to change — Facebook will continue to grow, Zuckerberg will continue to lead it, people will continue to use it, there will be no meaningful regulation, and third parties will continue to get access to data.
After two days of silence, Zuckerberg finally published a statement. The 900-word post is extremely straightforward and devoid of emotion. Here are key excerpts:
I want to share an update on the Cambridge Analytica situation . . .
We have a responsibility to protect your data . . . The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there’s more to do, and we need to step up and do it.
Here’s a timeline of the events [goes on to describe how Aleksandr Kogan’s app got access to data and shared it, and how Cambridge Analytica lied in certifying that it deleted the data.]
In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. . . . We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today. . . .
Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to confirm this. We’re also working with regulators as they investigate what happened.
This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that. . . .
First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. . . .
Second, we will restrict developers’ data access even further to prevent other kinds of abuse. For example, we will remove developers’ access to your data if you haven’t used their app in 3 months. We will reduce the data you give an app when you sign in — to only your name, profile photo, and email address. . . .
Third, we want to make sure you understand which apps you’ve allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data. . . .
Beyond the steps we had already taken in 2014, I believe these are the next steps we must take to continue to secure our platform. . . .
This bloodless statement lacks an apology. Zuckerberg explains what Facebook will fix, but while he takes responsibility, he doesn’t admit the company did anything wrong. In the CNN interview he did apologize. The New York Times interview is remarkably similar to the CNN interview — Zuckerberg was clearly well coached.
Why nothing will change
This story is all over every news organ imaginable. Dozens of people on my Facebook feed have posted that they are considering leaving Facebook. The FTC is looking into whether Facebook violated a consent decree about privacy that it agreed to in 2011.
From a dramatic storytelling standpoint, things should change.
But I don’t think they will.
Here are some observations and predictions.
- Facebook is the opiate of the masses. Facebook (and Instagram) are how people spend the moments in their day when they are doing nothing else. They are addicted to the updates from friends, the memes, the news stories that titillate and outrage them, and the ability to post their comments on the minutiae of their days. Having become addicted to that, no one wants to have it taken away. Regardless of its public statements, Facebook knows that its users, in the end, want it to continue pretty much as it has.
- The data that Facebook shared is personal, but not sensitive. The reason that Zuckerberg calls what happened a “failure of trust” and not a breach is that Facebook acted in accordance with its terms of service. Those who are outraged don’t see it that way. But ordinary people will instantly give up a metric ton of privacy in exchange for half a gram of convenience. (That’s why they took those quizzes in the first place.) No one is going to have their identity stolen as a result of what happened at Cambridge Analytica. No one’s private medical records are now public. Cambridge Analytica used the data in aggregate to build models. This story is a big deal because Facebook is ubiquitous, not because of the personal impact on individuals. Deep down, people know that Facebook has their data — after all, they made all those posts themselves — and they don’t care.
- Privacy violations don’t bring down companies. The only case I can think of where a data breach brought down a company was Ashley Madison. That was an actual data breach of sexual secrets, and Ashley Madison was a very small company. Equifax is still here. Target is still here. Yahoo continues to operate. Nearly all of their customers have not deserted them (at least, not because of the violations). And their violations were much worse than what Facebook did. These violations make people feel bad, but soon after, they resign themselves to what happened and go back to doing exactly what they used to do.
- Zuckerberg isn’t going anywhere. Zuck has a controlling interest in Facebook. Its stock price is a direct result of the strategic decisions he has made. Shareholders cannot force him out, and they will not try. He dodged calls for him to resign in 2012. He dealt with the scandal of liberal bias in trending topics and continued. He dealt with the accusations of allowing Russian trolls to manipulate the election and continued. His modus operandi is to identify problems, think them through, and try to fix them. That’s what he’s doing here. And he knows, with 2 billion users behind him, he doesn’t have to go anywhere.
- No meaningful regulation will occur. Facebook will be further regulated in some small way. Zuck admitted this in his CNN interview. But governments are too clumsy and lack the tools and knowledge to actually do anything significant about Facebook. Anything that slows down Facebook or interferes with users’ interaction with it will be very unpopular. Politicians know this. So they will hold hearings and make pronouncements, but nothing meaningful will change.
What will happen
This is a PR crisis, not a corporate crisis for Facebook. The changes that will happen will similarly be about image, not substance.
- Zuckerberg will testify. Zuckerberg (or possibly Sheryl Sandberg) will appear before Congress. Lawmakers will ask tough-sounding questions. Zuckerberg (or Sandberg) will give calm answers without promising to change anything. It will be moderately good theater. Afterwards, things will be exactly as they were before.
- Meaningless regulation will happen. Regulators will place limits on data sharing. They may change rules about advertising. These will have no effect on the Facebook experience or its operating model. They may make people in government and those who are upset about Facebook’s past activities feel good for a moment.
- Facebook may pay a fine. Governments in the US, UK, or Europe may fine Facebook for bad behavior and lax enforcement. The fine will be less than a billion dollars. Facebook makes $16 billion a year in profit, and that number is growing rapidly. It will pay the fine and move on.
- Facebook will make incremental changes to reduce privacy problems. Zuckerberg told the New York Times that the company has made progress in identifying and purging election trolls. His new policy regarding third-party use of data, and the audits Facebook will conduct on past data use, will squelch the problems that have occurred so far. What goes wrong in the future will be different from what has gone wrong so far. Facebook will describe this as iterative progress.
I stand by these predictions. I’m sorry if they aren’t emotionally satisfying to you, but it’s my job to describe what is most likely to happen, not what people want to happen.
See you on Facebook.