Could Facebook read encrypted messages? Would it?

A week ago, Facebook announced a shift to focus more on messaging (or, as Mark Zuckerberg described it, Facebook’s “privacy-focused” vision). I suggested that this would allow the company to personalize its ads even more. Would that even be possible if, as the company promised, the messages were encrypted?

A guy I respect on LinkedIn posted this comment:

Holy moly! I thought they were saying that the messages would be encrypted so that even Facebook would be unable to read them. (Which I interpreted as switching to a new business model, where they would start charging for FB Messenger after a certain number of free messages… which I think is what users actually want in order to pay for the service.). Are you saying they will unencrypt the messages & sell advertising based on the content? . . . 

I would consider “reading and scoring” to be—for all intents & purposes—*exactly* the same as not encrypting at all (or breaking the encryption). What would be the point of publicly announcing a major new privacy strategy by a company whose reputation has been trashed & destroyed, and then go ahead & sabotage it? Wouldn’t it be faster for Mark Zuckerberg to just put a gun in his mouth & pull the trigger?

This is worth discussing, so let’s discuss it.

What did Zuckerberg say about encrypted messages?

Here are the parts of Zuckerberg’s post that relate to encrypted messages.

I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about. . . . 

This privacy-focused platform will be built around several principles:

Private interactions. People should have simple, intimate places where they have clear control over who can communicate with them and confidence that no one else can access what they share.

Encryption. People’s private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services. . . .

In a few years, I expect future versions of Messenger and WhatsApp to become the main ways people communicate on the Facebook network. We’re focused on making both of these apps faster, simpler, more private and more secure, including with end-to-end encryption. . . . 

Encryption and Safety

People expect their private communications to be secure and to only be seen by the people they’ve sent them to — not hackers, criminals, over-reaching governments, or even the people operating the services they’re using.

There is a growing awareness that the more entities that have access to your data, the more vulnerabilities there are for someone to misuse it or for a cyber attack to expose it. There is also a growing concern among some that technology may be centralizing power in the hands of governments and companies like ours. And some people worry that our services could access their messages and use them for advertising or in other ways they don’t expect.

End-to-end encryption is an important tool in developing a privacy-focused social network. Encryption is decentralizing — it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information. This is why encryption is an increasingly important part of our online lives, from banking to healthcare services. It’s also why we built end-to-end encryption into WhatsApp after we acquired it.

In the last year, I’ve spoken with dissidents who’ve told me encryption is the reason they are free, or even alive. Governments often make unlawful demands for data, and while we push back and fight these requests in court, there’s always a risk we’ll lose a case — and if the information isn’t encrypted we’d either have to turn over the data or risk our employees being arrested if we failed to comply. This may seem extreme, but we’ve had a case where one of our employees was actually jailed for not providing access to someone’s private information even though we couldn’t access it since it was encrypted. . . .

On balance, I believe working towards implementing end-to-end encryption for all private communications is the right thing to do. Messages and calls are some of the most sensitive private conversations people have, and in a world of increasing cyber security threats and heavy-handed government intervention in many countries, people want us to take the extra step to secure their most private data. That seems right to me, as long as we take the time to build the appropriate safety systems that stop bad actors as much as we possibly can within the limits of an encrypted service. We’ve started working on these safety systems building on the work we’ve done in WhatsApp, and we’ll discuss them with experts through 2019 and beyond before fully implementing end-to-end encryption. As we learn more from those experts, we’ll finalize how to roll out these systems. . . .

Of course, the best way to protect the most sensitive data is not to store it at all, which is why WhatsApp doesn’t store any encryption keys and we plan to do the same with our other services going forward.

When reading this, you need to remember that Zuckerberg has announced actions before that Facebook has not followed through on in the way it seemed it would: content moderation, for example. But let’s take Zuckerberg at his word.

In my interpretation of these statements, here are the things Facebook would not do.

  • It wouldn’t store messages that are intended to expire.
  • By encrypting the messages, it would ensure that they cannot be intercepted and decoded in transit.
  • It would not keep a record of the messages in any central database.
  • It would never share messages with advertisers.
  • It would make itself unable to turn messages over to law enforcement or government authorities.

So this appears to contradict my earlier statement that Facebook could improve targeting based on the content of messages. Or could it?

How Facebook could target ads based on messages

Messages include two elements: data and metadata. The metadata is data about the message, including when it was sent, where it was sent from, what device it was sent from, who sent it, who it was sent to, whether the intended recipient got it, whether they responded to it, how long it took to get a response, and what other things were happening at the same time. And let’s not forget the data Facebook already has about you, both from your activities on Facebook (including who you respond to and what you click on) and your activities on other websites that have Facebook buttons or just share information with ad networks.

Facebook will surely use this combination of existing data on you and metadata to improve its ad targeting. That type of targeting would be unlikely to surprise anyone.
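
To make that concrete, here is a minimal Python sketch of the kind of metadata record a messaging server could log and fold into an ad profile even when the message body itself is encrypted. The field and function names are hypothetical, not drawn from Facebook’s systems.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical sketch -- none of these names come from Facebook's actual
# systems. Everything below is visible to the server even when the message
# body is end-to-end encrypted.
@dataclass
class MessageMetadata:
    sender_id: str                    # who sent it
    recipient_id: str                 # who it was sent to
    sent_at: datetime                 # when it was sent
    device: str                       # what device it was sent from
    approx_location: str              # where it was sent from (e.g., coarse IP geolocation)
    delivered: bool = False           # whether the intended recipient got it
    reply_latency_secs: Optional[float] = None  # how long a response took, if any

def targeting_features(meta: MessageMetadata, existing_profile: dict) -> dict:
    """Combine message metadata with the data Facebook already has about you."""
    return {
        **existing_profile,
        "messages_with": meta.recipient_id,
        "active_hour_utc": meta.sent_at.astimezone(timezone.utc).hour,
        "primary_device": meta.device,
        "responds_quickly": (meta.reply_latency_secs or float("inf")) < 3600,
    }
```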

But what happens when you send a message and Facebook or WhatsApp encrypts it?

The message exists in plain, unencrypted text first (it has to, so you can see it as you are typing it).

A module in the WhatsApp or Facebook Messenger client has to read that text and then encrypt it.

That “prepare, read, and encrypt” module will become increasingly sophisticated. It is likely to do things like suggest emojis, check URLs and pull up the featured graphics on the linked pages, and suggest other people to include in the message. After all, Facebook wants to make sure its messaging clients are as helpful as possible.
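
Here is a minimal sketch, in Python, of what that “prepare, read, and encrypt” path looks like. The helper functions are invented for illustration, and the symmetric Fernet cipher is a stand-in for the Signal-style protocol a real end-to-end messenger uses; the point is simply that every one of these features operates on the plaintext before anything is encrypted.

```python
import re
from cryptography.fernet import Fernet  # symmetric stand-in; real E2E apps use a Signal-style protocol

# Invented helper features, for illustration only.
def suggest_emoji(text: str) -> str:
    return "👍" if "sounds good" in text.lower() else ""

def find_urls(text: str) -> list:
    return re.findall(r"https?://\S+", text)  # URLs the client could fetch previews for

def send_message(plaintext: str, key: bytes) -> bytes:
    # 1. Every "helpful" feature runs on the unencrypted text.
    suggest_emoji(plaintext)
    find_urls(plaintext)

    # 2. Encryption happens only after the client has read the message.
    return Fernet(key).encrypt(plaintext.encode("utf-8"))

key = Fernet.generate_key()
ciphertext = send_message("Dinner at 8? Sounds good: https://example.com", key)
```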

So don’t tell me Facebook doesn’t “know” what’s in the message. Of course it knows. More specifically, the client software knows, even if it doesn’t share that information with “Facebook,” the vast cloud-based data collection operation.

I’m suggesting that the client might also extract certain values from the message — e.g. “This mentions clothing” or “This is about air travel” or “This matches messages from other Donald Trump supporters” and score the message before encrypting it. Is my correspondent right that doing this would be tantamount to Zuckerberg shooting himself in the face?
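
Concretely, the client-side scoring I have in mind might look something like the Python sketch below. This is speculation about the scenario I just described, not a documented Facebook feature, and a trivial keyword match stands in for whatever model a client might actually run.

```python
from cryptography.fernet import Fernet  # same stand-in cipher as the previous sketch

# Speculative illustration of client-side scoring before encryption.
TOPIC_KEYWORDS = {
    "clothing": {"jacket", "dress", "sneakers", "sale"},
    "air_travel": {"flight", "airport", "boarding", "layover"},
}

def score_message(plaintext: str) -> dict:
    """Tag the message with coarse topics -- done while it is still readable."""
    words = set(plaintext.lower().split())
    return {topic: bool(words & keywords) for topic, keywords in TOPIC_KEYWORDS.items()}

def send_with_scores(plaintext: str, key: bytes):
    scores = score_message(plaintext)                       # extracted *before* encryption
    ciphertext = Fernet(key).encrypt(plaintext.encode("utf-8"))
    # The ciphertext goes to the recipient; the scores could quietly join the
    # sender's ad profile without the message body ever being readable again.
    return ciphertext, scores

key = Fernet.generate_key()
ciphertext, scores = send_with_scores("Found a cheap flight to Denver", key)
# scores == {"clothing": False, "air_travel": True}
```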

Well, let’s remember all the other times Zuckerberg and Facebook have deceived us on privacy: for example, sharing data with an academic who passed it on to the political consulting firm Cambridge Analytica, allowing advertisers to see Facebook users’ private messages, and using the phone number you gave it solely for two-factor authentication to target ads.

Zuckerberg doesn’t get the benefit of the doubt here. What do you think would happen if Facebook was extracting information from private messages and using it to sharpen ad targeting? Would it really backfire? Or would you just read a statement like this?

I’d like to address the recent accusations that Facebook has violated the privacy of users sending private messages on Facebook Messenger, Instagram, and WhatsApp.

Back in 2019, I promised that Facebook would use end-to-end encryption to protect the privacy of your messages. We did that. No one can read your messages without permission. If you set them to disappear, they do — permanently. We do not store them, and of course, we never share the content of messages with anyone, including advertisers or governments. Because they are encrypted, no one can read them except the sender and the recipient.

The only way that the encryption software can work is for it to read the original message before encrypting it. Some of our engineers have been building additional features into that messaging client. And it now appears that among those features are the ability to draw general conclusions about the message — is it illegal, for example, or does it need to be delivered urgently.

Some of those conclusions have also been incorporated into users’ profiles. This enables us to better serve you with a messaging client better attuned to your particular needs. While we never share this information with advertisers, because it adds to our knowledge of you as a user, it enables us to better serve you with content, including advertisements, that is more likely to be relevant to you.

We stand by our pledge to never share encrypted information with anyone. However, because of the objections this has raised, we have now added a privacy setting that allows you to eliminate all profile information associated with messaging. While this will make the messaging client less convenient for many users, we wanted to make sure that users who prefer this extreme version of privacy will have the option.

We remain committed to your privacy — you can rest assured that no one is able to read your messages. Thanks for continuing to use our messaging services; we will continue to earn your trust.

To be clear, Zuckerberg didn’t write this: I ghostwrote it for him. But doesn’t it sound like what he would write?

Would this create an outcry? Of course it would.

Would it count as Facebook violating its promise? Many would say it did, but Facebook would say it didn’t.

Would it amount to Zuckerberg shooting himself in the face(book)? Or is it just more of the same hair-splitting, waffling, and excuses in a world that, addicted to social media, will keep using Facebook’s products regardless of these hiccups and scandals?

It’s software. There’s always room to wriggle away from responsibility. And Facebook and Zuckerberg have proven that wriggling away is one of their core skills.


2 Comments

  1. Josh

    You’ve articulated why I’ve left Facebook. I can’t count how many times Zuck and company tried to weasel out of a situation such as the all-too-real one that you ghostwrote here. I didn’t know about the TFA scandal and have shared it with my students. It’s amazing to me how many second chances people will give Facebook. Cambridge Analytica for me was the final straw.

  2. I can see why you might think the statement from Facebook could mean:

    “It would not keep a record of the messages in any central database.”

    But that is also a key part of FB Messenger. It’s tempting to think of an FB message like a postcard or letter where it is created in one place, then it flies through the internet, and arrives somewhere else – but that is not how Messenger, or email, or text messages work. There is always a central database of messages, and when you read them, all you are doing is downloading a copy of the message to your device. It has to work that way so that you can see your messages using your phone, or your laptop, or your tablet, or your work computer, etc.

    I don’t think there is anything necessarily underhanded in that – but often people don’t think about how common services like messenger really work when they are trying to understand privacy issues or to advocate for laws etc.