The 8 Expectations of Privacy
How we built a shared language for privacy at Meta, formerly known as Facebook.
May 2023

Joining Facebook was not in my plans. After Apple acquired my startup, I was going to build another company. I got a call that convinced me to try it for a year first. I stayed for three and a half years.

What kept me was the problem. Privacy at Facebook in 2020 was not a solved thing. Engineers, lawyers, product managers, and executives were all making privacy decisions every day using different frameworks. The result was inconsistency, mistrust, and a company that had lost the public's trust on one of the most fundamental questions in technology.

My job was to change that. The 8 Expectations of Privacy were the result.

But first, some context on where we were.

I think of privacy as having gone through three eras. The first was the No Privacy Era — before 2014, when tech companies could largely do whatever they wanted with user data. The second was the Product Principle Era, when companies started taking privacy seriously as a strategy. You saw this with WhatsApp's end-to-end encryption, Apple's privacy marketing, DuckDuckGo's private search. Each had its own flavor. By 2023, we had entered a third era: the High Bar of Privacy. It is no longer enough to protect users with good product principles. You also have to be defensible to regulators. You are now serving two parties, not one.

The 8 Expectations were built for this third era. Every product at Facebook had to meet all eight before it could launch: purpose limitation, data minimization, data retention, external data misuse, transparency and control, data access and management, fairness, and accountability.

The framework behind them comes down to a few core principles.

The first is purpose limitation. Data should only ever be used for the specific reason it was collected. Not for a related reason. Not for a future reason that seems reasonable at the time. If you want to use data for something new, you go back to the beginning and justify it again. This is the discipline that prevents the slow drift from "we collected this for X" to "we're now using it for everything."
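The discipline above can be made concrete in code. This is a minimal sketch, not how Facebook's systems actually worked: it assumes a hypothetical `Dataset` type that carries the purposes approved at collection time, so any new use either matches one of them or fails and forces a fresh justification.

```python
from dataclasses import dataclass

# Hypothetical sketch: a dataset carries the exact purposes it was
# collected for. A "related" or "seems reasonable" purpose does not
# pass; only a purpose approved at collection time does.
@dataclass(frozen=True)
class Dataset:
    name: str
    approved_purposes: frozenset

def can_use(dataset: Dataset, purpose: str) -> bool:
    # Strict membership check: no inference, no purpose drift.
    return purpose in dataset.approved_purposes

locations = Dataset("user_locations", frozenset({"nearby_friends"}))

can_use(locations, "nearby_friends")  # approved at collection
can_use(locations, "ad_targeting")    # denied: go back and justify it again
```

The point of the strict check is that drift from "we collected this for X" to "we use it for everything" becomes a visible, deniable event rather than a silent default.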

The second is transparency. The relationship between a company and a user works like the one between two people. Everyone gives you the benefit of the doubt when they first meet you. But if you lie to them or hide the asterisk about how you use their data, trust is easy to lose and hard to rebuild. Don't hide it. Be upfront.

The third is proportionality. The value a person gets from a product must exceed the privacy they give up to use it. If there is no product value, do not collect the data. Every team believes their feature is worth the data it requires. Most of the time, they are wrong.

The fourth is what I call the Grandpa and Child framework. If a grandparent and a ten-year-old can explain the privacy narrative back to you in plain language, you have reached simplicity. If they cannot, you have not thought clearly enough about what you are doing. Complexity is often a cover for practices that would not survive scrutiny.

The last principle is accountability. Good intentions are not enough. You need systems and technical controls that make it hard to violate these principles, not just policies that say you shouldn't. Privacy has to be enforced, not just stated.
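To illustrate the difference between a policy and a technical control, here is a minimal sketch under hypothetical names (`read_data`, `APPROVED`, `AUDIT_TRAIL` are all invented for illustration): an access gate that refuses any read without an approved purpose and records every attempt, so accountability is enforced by the system rather than promised in a document.

```python
# Hypothetical sketch: a single choke point for data access.
# Policy says "don't misuse data"; this gate makes misuse fail loudly
# and leaves an audit trail either way.
APPROVED = {"user_emails": {"account_recovery"}}
AUDIT_TRAIL = []  # (table, purpose, allowed) for every attempt

def read_data(table: str, purpose: str) -> str:
    allowed = purpose in APPROVED.get(table, set())
    AUDIT_TRAIL.append((table, purpose, allowed))  # log even denials
    if not allowed:
        raise PermissionError(f"{purpose!r} is not an approved purpose for {table!r}")
    return f"<rows from {table}>"

read_data("user_emails", "account_recovery")  # allowed, and audited
# read_data("user_emails", "marketing")       # raises PermissionError, and audited
```

The design choice is that denials are logged too: an auditor can see not just what happened, but what someone tried to do.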

The hardest lesson I learned building this framework is that privacy is not a legal problem. It is a design problem. You cannot audit your way to trustworthiness. You have to build it in from the start.

2023 note: After this work, I went to Facebook AI Research (FAIR) — now known as Meta FAIR — to work on next-generation AI models. The same questions apply. The same principles apply. We are building systems that will make consequential decisions about people at a scale that makes the privacy problems above look small. Purpose limitation, proportionality, and accountability are not privacy concepts. They are the foundation of building any powerful system responsibly.