Apple’s in a bind. The FBI wants them to crack the encryption on a San Bernardino terrorist’s iPhone. Apple believes that’s a dangerous precedent. Apple CEO Tim Cook’s open letter is breathtakingly simple and clear. Learn from it.
Here’s Apple’s logic: Breaking encryption creates a “back door.” Any such back door would inevitably get out. Thieves and foreign governments could use it. And then none of our data would be safe. Because financial systems and everyday privacy rely on the same kind of encryption, this precedent would create the same risks in those domains.
This is an unpopular position. Gloss over the details and it seems like Apple is protecting a terrorist. As Donald Trump, always articulating the simplistic view, said, “Who do they think they are?”
In this situation, a press release would be useless. Instead, Tim Cook published an 1,100-word, plain-language open letter explaining the company’s position. I’ll take it apart and show you how and why it works. Excerpts below, with my comments in brackets.
A Message to Our Customers
[Apple starts with customers, rather than itself, reframing the issue from the very first heading]
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. [There’s no “soft warmup.” Cook clearly and directly states the conflict in the first 33 words. Both sentences are active voice — one states what the government wants, the other what Apple is doing. There is no jargon. Everything you write should start with this level of clarity, brevity, and directness, especially when the issue is contentious.]
The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.
All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data. [Apple starts not by talking about its position, but by talking about “our” data. This makes it personal. Note, as well, how this explains encryption in simple, jargon-free terms. I’m not crazy about “incredible” and “deeply committed,” but compared to the weaselly superlatives in most announcements, this is restrained.]
The San Bernardino Case
. . . The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone. [This section is so controversial, it could have been called “Why we defied the FBI.” But Apple clearly and simply explains (1) that it is cooperating and (2) the limits of that cooperation. Note the use of “we” — Tim Cook writes of Apple in the first person and does not hide. Instead of describing the situation, he offers a narrative. This is far more effective and relatable than a press release could ever be.]
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. [This describes Apple’s main point of disagreement with the government without accusing or demonizing anyone. Cook restricts the argument to facts and its positions. If you’re going to argue, this is how to do it.]
The Threat to Data Security
. . . In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. [This is the heart of Apple’s case — that it would be creating a key that would pierce every user’s security. The analogy of the key is simple, powerful, and clear to anyone. There’s lots of passive voice in this passage. So who’s the missing actor? In this case, it’s the hackers and thieves who would defeat the encryption and unlock the data. Cook hides them with passive voice to focus attention on the victim — you — rather than the anonymous attacker.]
A Dangerous Precedent
. . . The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge. [This is the crux of Apple’s argument, and it’s overstated. But notice how Apple makes the case. It’s about “you” — your iPhone, your messages, your health records, and your phone’s microphone. Apple describes security armageddon in simple, calm language. You cannot read this without wondering what it means, not just for the world, but for you, personally.]
While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect. [Tell ’em what you’re going to tell ’em, tell ’em, tell ’em what you told ’em. This is the “tell ’em what you told ’em” bit. At the end, remind people what your position is. Apple did this in just 45 words.]
Tim Cook [No hiding — it’s from the CEO. He makes it clear that you can hold him responsible for these words.]
Here’s the lesson: if you have to communicate in a crisis, use clear, simple, direct language. Use “I,” “we,” and “you.” Use clear analogies. Avoid jargon. Describe the impact on customers. And don’t hide who you are or what you mean.
Apple is winning this PR battle. Even the Los Angeles Times, hometown paper for many in San Bernardino, has come out in support of it. Defending encryption and standing up to the FBI requires this level of clarity. Learn from it.
More on this topic: Cook’s letter to employees and Answers page