PRIVACY PROGRESS UPDATE

We have a responsibility to protect people’s privacy and give them control to make their own choices.

We believe in being transparent about how we are approaching privacy as a company.

Commitment to Changing

We’ve previously shared our commitment to changing our privacy approach and investing in efforts to ensure we protect people’s privacy. We are making progress on our work to build a stronger privacy foundation by designing processes and technical mechanisms that drive accountability and ensure privacy is everyone’s responsibility at Facebook. This work builds on our existing work designed to comply with global privacy/data protection laws.

Holding Ourselves Accountable

Since committing to making these changes, one thing we’ve heard is that being accountable also means being transparent, which is why we are sharing this update on our privacy progress. We want to provide you with a closer look at the work we’re doing to embed privacy across our company operations, the outcomes that this work is driving and the technical solutions we’re investing in to address privacy at scale. We hope this information will both help our community understand the work we’re doing to protect privacy and enable dialogue about the approach we’ll take from here.

Photo of Facebook CEO Mark Zuckerberg addressing employees at an outdoor town hall

“This is a new chapter for the company. Privacy is more central than ever to our vision for the future and we’re going to change the way that we operate across the whole company, from the leadership down and the ground up.”


- Mark Zuckerberg, Chief Executive Officer

OUR WORK

Our privacy work is a journey that will never end. While we’ve made significant investments over the past year, we’re committed to continually refining and improving our privacy program as we respond to evolving expectations and technological developments. We’ll continue to share our progress as we improve and evolve our program.

01. ACCOUNTABILITY FOUNDATION

We’ve designed a governance framework to foster accountability for privacy at every level of our company.

Photo of Chief Privacy Officer of Product Michel Protti on Facebook’s Menlo Park campus

“We’ve made important progress, but we still have a tremendous amount of work to do. We’re in the early phases of a multi-year and ongoing effort to evolve our culture, our operations and our technical systems to honor people’s privacy.”


- Michel Protti, Chief Privacy Officer for Product

We have designed our privacy program to scale and evolve over time. It includes a governance structure and privacy training and education that together provide the foundation for privacy accountability across the company.

Governance

Privacy is everyone’s responsibility at Facebook: from our CEO and executives to engineers and sales teams across the globe, we are all responsible for privacy. As a result, a cross-functional group of organizations across the company provides the engineering, legal, policy, compliance and product expertise that enables the design and implementation of our privacy program.

Led by Chief Privacy Officer, Product, Michel Protti, the Privacy Team is made up of dozens of teams, both technical and non-technical, focused solely on privacy and led by some of our most experienced leaders.

The Privacy Team is at the center of our company’s efforts to build a comprehensive privacy program. Its mission — to honor people’s privacy in everything Facebook does — guides this work.

But the Privacy Team is just one of many organizations across the company responsible for privacy. There are thousands of people in different organizations and roles across Facebook working to embed privacy into all facets of our company operations, including public policy, privacy strategy and legal, to name a few. Getting privacy right is a deeply cross-functional effort, and we believe everyone at Facebook is responsible for that effort.

Led by Erin Egan, Vice President and Chief Privacy Officer, Policy, the Privacy Public Policy team leads our engagement in the global public discussion around privacy, including new regulatory frameworks, and ensures that feedback from governments and experts around the world is considered in our product design and data use practices, including during the course of our privacy review process.

The Privacy Legal team is both embedded in the design of our program, as well as tasked with providing counsel on what is legally required during the course of our privacy review process.

The Privacy Committee is an independent committee of Facebook’s Board of Directors that meets quarterly to ensure we live up to our privacy commitments. The Committee is made up of independent directors with a wealth of experience serving in similar oversight roles.

The Committee receives regular briefings on the state of our privacy program and our compliance with our FTC Order from our independent privacy assessor, whose job it is to review and report on our privacy program on an ongoing basis.

Internal Audit brings independent assurance on the overall health of our privacy program and the supporting control framework.

Privacy Education

Our goal is to make privacy a core responsibility for every employee at Facebook. Part of this requires driving continuous privacy learning and education spanning training, internal campaigns, regularly updated privacy content and other dynamic resources.

A foundational component of our privacy education approach is delivered through our new hire and annual privacy training, which covers the core elements of privacy, including data handling, security and sharing. Through its educational videos and modules, our annual privacy training provides scenario-based and real-world examples of privacy risks aligned with Facebook’s business operations. The training culminates with an assessment to test employee understanding of the required privacy concepts.

We take privacy education seriously. If employees do not complete the required training on time, they can face consequences, including loss of access to internal systems.

Another way we drive privacy education is through regular communication to employees that we call “awareness and engagement.” In addition to traditional annual training courses, we deliver ongoing privacy content through internal Workplace channels, lightning talks with privacy leadership, internal Q&A sessions, a dedicated Privacy Week and an internal hub of on-demand privacy content to help guide decisions and processes.

And when we participate in external privacy events like Data Privacy Day, we drive internal awareness and engagement through internal channels to ensure everyone has an opportunity to participate.

Privacy Week drives cross-company focus on privacy, features internal and external speakers, and highlights key privacy concepts and priorities through engaging content and events that occur throughout the week.

Our lightning talk series highlights leaders across the organization and covers key privacy focus areas and emerging privacy issues that are top of mind at the leadership level.

02. ACCOUNTABILITY IN PRACTICE

We’re improving how we operationalize privacy, including how we build new products.

We’ve made progress on our work to give people more control over their privacy, and our broader mission to honor people’s privacy in everything we do. We’ve done so by building processes, products and technical mechanisms that have laid the foundation for privacy and accountability across the company.

Photo of two people speaking at an internal roundtable

In order to put our accountability foundation into practice, we have designed processes, escalation paths and technical mechanisms that embed privacy across all facets of our company operations.

Risk assessments are essential to our ability to identify, assess and mitigate material privacy risks. We have designed a privacy risk assessment program that performs an annual assessment to identify, assess and address privacy risk across the company, as well as a process to assess privacy risk after an incident occurs. We will continue to evolve and mature our privacy risk assessment process as we assess it over time.

We have designed safeguards — operational activities, policies and technical systems that we have put in place — to address privacy risk and meet privacy expectations and regulatory obligations.

We have also developed processes that will regularly assess our safeguards for design and operational effectiveness. These processes include our control self-assessment process where safeguard owners test and report on their safeguard’s design effectiveness, as well as periodic assessments conducted by the Privacy Team.

The privacy review process is the process by which we assess privacy risks that involve the collection, use or sharing of people’s information and external representations about our privacy and security practices. The process is also designed to help identify and mitigate the privacy risks we identify.

Our development of products and features and reviews of new or modified practices are guided by our internal privacy expectations, which include:

  1. Purpose Limitation: Process data only for a limited, clearly stated purpose that provides value to people.
  2. Data Minimization: Collect and create the minimum amount of data required to support clearly stated purposes.
  3. Data Retention: Keep data for only as long as it is actually required to support clearly stated purposes.
  4. External Data Misuse: Protect data from abuse, accidental loss and access by unauthorized third parties.
  5. Transparency and Control: Communicate product behavior and data practices proactively, clearly and honestly. Whenever possible and appropriate, give people control over our practices.
  6. Data Access and Management: Provide people the ability to access and manage the data that we have collected or created about them.
  7. Fairness: Build products that identify and mitigate risk for vulnerable populations, and ensure value is created for people.
  8. Accountability: Maintain internal process and technical controls across our decisions, products and practices.

Privacy Review is a deeply collaborative, cross-functional process used to evaluate compliance with our obligations and to identify and mitigate broader privacy risks that go beyond our legal requirements. It is led by our privacy team and conducted by a dedicated group of internal privacy experts across legal, policy and other cross-functional teams with backgrounds in product, engineering, regulation, security and policy. This group is responsible for making privacy review decisions and recommendations.

As a part of the process, the cross-functional team evaluates privacy risks associated with the project and determines if there are any changes that need to happen before launch to control for those risks. If there’s no agreement between the members of the cross-functional team on what needs to happen, the team escalates to a central leadership review, and further to the CEO, if needed for resolution.

We have also introduced technical requirements and tools to enhance accountability and operate the privacy review process at scale.

We developed a centralized tool that is used throughout the project lifecycle for Privacy Review. It enables teams to manage all aspects of their privacy review submissions, including to search and manage historical and new privacy commitments that we have made as a company.

Over the next several years, we’ll leverage these tools to support continued investment in infrastructure improvements that will systematize the process of enforcing our privacy decisions. These changes will move human-driven processes to automated ones that will make it easier to consistently enforce our privacy commitments across our products and services.

In addition to developing a centralized tool for the process, we have introduced a technical implementation review that reviews, verifies and documents the technical implementation of privacy mitigations and commitments prior to product launch.

This process, integrated with the tools we use to build software at Facebook, enables us to verify that what is agreed to in the documented decision is in fact what is implemented.

Our Incident Management program operates globally to oversee the processes by which we identify, assess, mitigate and remediate privacy incidents. Although the privacy team leads the incident management process, privacy incidents are everyone’s responsibility at Facebook, with teams from across the company, including legal, policy, and product teams playing vital roles. We continue to invest time, resources and energy in building a multi-layered program that is constantly evolving and improving. Although each layer plays an important role, below we highlight three components that reflect our approach.

We take a layered approach to protecting people and their information, implementing many safeguards to catch bugs. Given the scale at which Facebook operates, we have invested heavily in building and deploying a wide range of automated tools that are intended to help us identify and remediate potential privacy issues as early and quickly as possible. Issues detected through these automated systems are flagged in real time to facilitate rapid response, and in some cases can be self-remediated.

Of course, no matter how capable our automated systems become, the oversight and diligence of our employees always plays a critical role in helping to proactively identify and remediate incidents. Our engineering teams are constantly reviewing our systems to identify and fix issues before they can impact people.

Since 2011, we have operated a bug bounty program in which external researchers help improve the security and privacy of our products and systems by reporting potential security vulnerabilities to us. The program helps us scale detection efforts and fix issues faster to better protect our community, and the rewards we pay to qualifying participants encourage more high-quality security research.

Over the past 10 years, more than 50,000 researchers joined this program and around 1,500 researchers from 107 countries have been awarded bounties. A number of them have since joined Facebook’s security and engineering teams and continue this work protecting the Facebook community.

While we’ve adopted a number of protections to guard against privacy incidents like unauthorized access to data, if an issue does occur, we believe that transparency is an important way to rebuild trust in our products and processes over time. Accordingly, beyond fixing and learning from our mistakes, our Incident Management program includes steps to notify people where appropriate, such as a post in our Newsroom or our Privacy Matters blog about issues impacting our community, or working with law enforcement or other officials to address issues we find.

Third parties are external partners who do business with Facebook but aren’t owned or operated by Facebook. These third parties typically fall into two major categories: those who provide a service (like vendors who provide creative support) and those who build their businesses around our platform (like app or API developers). To mitigate privacy risks posed by those third parties that receive access to data, we developed a dedicated program that oversees the privacy risks presented by third-party access to data and implements appropriate privacy safeguards.

We have developed a third-party privacy assessment process for service providers to assess and mitigate privacy risk at Facebook. These third parties are also bound by contracts containing terms based on the risk tier assigned after assessment. During our engagement with a third party, its risk profile determines how it is monitored, including periodic reassessments, and what enforcement actions result from violations, up to and including termination of the engagement.

We have designed a formal process for enforcing and offboarding third parties who violate our privacy or security obligations.

To support this, we have developed procedures and infrastructure designed to ensure that third-party developers complete the Data Use Checkup, an annual self-certification of the purpose for and use of each type of information that they request or continue to have access to, and that each purpose and use complies with Facebook’s Platform Terms and Developer Policies. We do not take these representations for granted.

We have also developed technical and procedural mechanisms to monitor their compliance with our platform terms on both an ongoing and periodic basis. When there is a violation, we take into account the severity, nature and impact of the violation, the third party’s malicious conduct or history of violations, and applicable law to determine the appropriate action to take.

We also developed principles-based data security standards for developers to drive better security practices across our platform and the developer ecosystem more broadly. As an example, we launched the Platform Initiatives Hub for developers to ensure they have the tools and information they need to continue to use our platform responsibly.

We have technical mechanisms in place to prevent and mitigate third-party misuse of Facebook data through proactive and reactive measures like prevention, deterrence, detection and enforcement. While each mechanism is important, we highlight a few examples of this work in practice.

We have invested in infrastructure and technical tools to help monitor, detect and prevent third parties from misusing data. This includes technical mechanisms designed to prevent unauthorized access to Facebook data via scraping and other means. Examples of these investments include rate limits, limits on the amount of data a single session can access, and systems to detect and respond to automation tools.
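To make one of these mechanisms concrete, the sketch below shows a generic token-bucket rate limiter of the kind commonly used to cap how many requests a single session can make. It is a simplified illustration with parameters we chose for the example, not Facebook’s actual infrastructure.

```python
import time


class TokenBucket:
    """Generic token-bucket rate limiter: a session may burst up to
    `capacity` requests, with tokens refilled at `rate` per second."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # request should be throttled


# One bucket per session caps how much data that session can pull.
bucket = TokenBucket(capacity=5, rate=0.5)
results = [bucket.allow() for _ in range(6)]  # the sixth burst request is denied
```

Keeping one bucket per session throttles sustained automated access, like scraping, while leaving ordinary short bursts of user activity unaffected.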

In addition to investing in technical teams and tools that monitor and detect suspicious activity, we also use other mechanisms to combat third-party data misuse. Our External Data Misuse team investigates suspected scrapers to learn more about what they’re doing and to make our systems stronger. We’ve taken a variety of actions against data misuse, including sending cease and desist letters, disabling accounts, filing lawsuits against scrapers engaging in egregious behavior, and requesting that companies hosting scraped data take it down. This is also why it’s important for governments to do more to investigate and take action against unlawful scraping activity.

03. PRIVACY PRODUCT OUTCOMES

We strive to design products and features with privacy in mind.

Another photo of Facebook CEO Mark Zuckerberg addressing employees at an outdoor town hall

“This is going to be a major turning point for our company, and it is going to require all of your help and work to help deliver on this for the people we serve. We have a responsibility to protect people’s privacy.”


- Mark Zuckerberg, Chief Executive Officer

The accountability processes, safeguards and technical mechanisms that we’ve built help ensure that new products and features embed privacy by design. We’ve seen these updated processes enable us to improve our privacy approach in new products and features, as we pivot to respond to the world around us.

To provide greater transparency and control to people, we’ve developed a number of privacy tools, like Privacy Checkup, which houses our privacy settings in one centralized place, and Off-Facebook Activity, which provides a summary of your activity on other apps and websites and explains how that information is used (for example, to show you more relevant ads). You can also clear that information from your account if you want to. We are continuously working to improve many of these tools to provide greater transparency and control to people.

We launched Access Your Information in 2018 to give people a central place to access their information on Facebook. Since then, we’ve worked to improve transparency and usability for people by reorganizing data categories into more granular and easy to understand subcategories, such as ‘Ads Information’, ‘Friends and Followers’, and ‘Apps and Websites Off of Facebook.’

We also added search functionality, so people can find data categories more easily. We also added information about how your data may be used to personalize your experience on Facebook. We made these updates to make it easier for people to access and understand their data in meaningful ways.

In 2019, we launched Manage Activity as a transparency and control tool to help people archive or delete their old posts in one, centralized place. We received feedback from privacy experts and people about the limitations around managing old posts, photos and other content in bulk. So we created an archival control for content you no longer want others to see on Facebook, but may want to keep for yourself.

Manage Activity also includes a deletion control that provides a more permanent option, so people can move old posts in bulk to the trash. After 30 days, posts sent to the trash will be deleted, unless you choose to manually delete or restore them before then.

We also understand that people want a simple way to manage lots of posts at once, so we designed Manage Activity to let you manage posts in bulk, and we created filters to help you sort and find the content you are looking for, like posts with specific people or from a specific date range.

We introduced the option to use disappearing messages on WhatsApp. When the option for disappearing messages is turned on, new messages sent to a chat are designed to disappear after a number of days, helping the conversation feel lighter and more private. In a one-to-one chat, either person can turn disappearing messages on or off. In groups, admins will have the control.

We started with seven days because we think it offers peace of mind that conversations aren’t permanent, while remaining practical so you don’t forget what you were chatting about. The shopping list or store address you received a few days ago will be there while you need it, and then disappear after you don’t.

As part of Facebook’s vision for a privacy-focused platform, we believe people’s private communications should be secure. We care deeply about providing the ability for people to communicate privately with their friends and loved ones where they have confidence that no one else can see into their conversations.

We currently provide private communication through WhatsApp and Messenger. In WhatsApp, end-to-end encryption ensures only you and the person you’re communicating with can read or listen to what is sent, and nobody in between. And in Messenger, a secret conversation is protected by end-to-end encryption and intended just for you and the person you’re talking to.

In a few years, we expect future versions of Messenger and WhatsApp to become the main ways people communicate on the Facebook network. We’re focused on making both of these apps faster, simpler, more private and more secure, including with end-to-end encryption. We plan to add more ways for people to interact privately with friends, groups and businesses, so connecting across the Facebook network will become a fundamentally more private experience.

Ensuring that encryption is implemented across our messaging services in an effective and responsible manner will require continued dialogue and collaboration with external stakeholders.

Recognizing that young people have unique privacy needs, our privacy program pays particular attention to youth privacy. Our goal is to provide services that promote the best interests of the young people who use them, in coordination and consultation with parents, regulators, policymakers and civil society experts.

We employ a number of methods to verify people’s ages so that they receive age-appropriate experiences, and Facebook products generally have age-appropriate safeguards in place.

An example of youth privacy in action is Messenger Kids, which offers an age-appropriate messaging experience for our youngest users. Built for kids, it’s a slimmed-down version of Messenger that doesn’t include ads or in-app purchases and offers parental control features that let parents manage aspects of their kids’ activity.

Our work to communicate transparently includes providing education to improve understanding of our practices, increasing awareness of our practices, and ensuring information is accessible and easy to find.

There are many ways we might communicate data privacy practices to people. In some instances we might communicate with people through a dedicated Privacy section of our Newsroom, where we provide more information about how we’ve approached privacy in the context of particular features or issues. In others, we might provide people with in-product notices or contextual education about our privacy controls to help them understand our data processing activities and how we use their data to inform their experience, or communicate this information through our Data Policy, which describes our privacy practices in detail.

As a part of our efforts to respond to the COVID-19 pandemic, we focused on how we could share data for good while protecting people’s privacy. We developed data sets in the form of maps of populations and movement to help inform disease forecasting efforts and protective measures during the pandemic.

To protect people’s privacy, we use technical mechanisms to mitigate re-identification risk, which include:

  • Aggregating publicly available datasets that include location information in a way that protects people’s privacy, using techniques like spatial smoothing to create weighted averages and avoiding map tiles where very few people live
  • Applying a differential privacy framework to publicly available Movement Range Maps, which are mobility datasets that contain not only location data but the movement of people over time; the framework accounts for the sensitivity of the dataset and adds noise proportionally to ensure, with high probability, that no one can be re-identified, even with respect to data we were not considering at the time of design
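As a rough sketch of the general idea behind that second technique, the standard Laplace mechanism adds noise scaled to a query’s sensitivity and a chosen privacy budget epsilon before a count is released. This is an illustration of the textbook mechanism, not the specific framework used for Movement Range Maps.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def dp_count(true_count: int, sensitivity: float, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy: the noise is
    scaled proportionally to the query's sensitivity, so more revealing
    data gets more noise."""
    return true_count + laplace_noise(sensitivity / epsilon)


random.seed(0)  # fixed seed only so the example is reproducible
noisy = dp_count(true_count=1000, sensitivity=1.0, epsilon=0.5)
# The released value stays close to 1000, but any single person's
# presence changes the output distribution by at most a factor of
# e**epsilon, which bounds what an observer can learn about them.
```

Larger sensitivity or a smaller epsilon yields more noise, which is how the protection scales with how revealing the underlying data is.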

In addition to the safeguards applied to mitigate re-identification risk, we also leverage data use agreements to stipulate clear guidelines that ensure responsible data practices.

04. INVESTMENTS IN A TECHNICAL FOUNDATION THAT SUPPORTS PRIVACY

We’re embedding privacy into the technical fabric of our company.

While we still have a lot of work left to do, we have made meaningful progress toward our goal of embedding our privacy responsibilities into our systems. Our continued technical privacy investments will ensure that we can fulfill our mission to honor people’s privacy in everything we do.

Photo of two people collaborating

Building privacy into our decision-making processes is an important area of continued focus. But as Facebook grows, an important way of scaling privacy protections will be to build technical foundations that promote privacy and accountability at scale. We are creating sustainable technical solutions to meet evolving privacy expectations and ensure consistent application of our privacy requirements across our products and systems.

Creating advanced technical solutions to address privacy will require considerable effort. It’s a company-wide undertaking that will likely take years to fully accomplish, but we believe it’s an important investment in the future of privacy at Facebook.

Building technical solutions that can adapt to evolving privacy expectations first requires significant underlying technical work. We began this work by improving how we manage data throughout its lifecycle at Facebook; central to this has been our work to handle data in accordance with our deletion policies at scale.

Deletion is an important privacy expectation of people who use our applications and services. People trust that when they decide they no longer want their data to exist online, it will be deleted effectively and completely.

The current approach to data deletion across industry is an onerous one, in which developers are required to manually write repetitive code that accounts for each update or change to a product and ensures that all the deletion logic still holds up. The complex architecture of modern distributed data stores also leaves room for potential error.

We built a data deletion framework that helps alleviate the risk of potential error through machine-learning automation. Through the deletion framework, engineers annotate intended deletion behavior (e.g., “when a user deletes a post, also delete all the comments”) and our framework handles the deletions across multiple datastores and with reliability guarantees. The framework also helps engineers ensure that we address deletion early on in the product development process, before any data can be stored. While it does not yet cover all data at Facebook, our framework is already processing billions of deletions every day.
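The annotation idea can be sketched roughly as follows. This is a hypothetical toy model, not Facebook’s framework: the edge table, datastore and function names are all invented for illustration.

```python
# Engineers declare, per object type, which dependent types must also be
# deleted; the framework walks those edges instead of relying on
# hand-written cleanup code for every product change.
DELETION_EDGES = {
    "post": ["comment", "reaction"],  # deleting a post deletes its comments/reactions
    "comment": ["reaction"],
}

# Toy datastore: object id -> (type, parent id)
STORE = {
    "p1": ("post", None),
    "c1": ("comment", "p1"),
    "c2": ("comment", "p1"),
    "r1": ("reaction", "c1"),
}


def cascade_delete(obj_id: str) -> list[str]:
    """Delete obj_id and, per the declared edges, every dependent object."""
    obj_type, _ = STORE[obj_id]
    children = [i for i, (t, parent) in list(STORE.items())
                if parent == obj_id and t in DELETION_EDGES.get(obj_type, [])]
    del STORE[obj_id]
    deleted = [obj_id]
    for child in children:
        deleted += cascade_delete(child)
    return deleted


deleted = cascade_delete("p1")  # removes the post, its comments, and their reactions
```

Because the cascade is driven by declared edges rather than per-product cleanup code, a schema change only requires updating the annotations, which is how a framework like this reduces the risk of dangling data.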

We’re also investing in privacy-enhancing technologies — technologies based on advanced cryptographic and statistical techniques that minimize the data we collect, process and share — to help protect data at different stages of the data lifecycle. And while these technologies are an important part of our work to build a technical foundation that supports privacy, we are still in the early stages of this investment and are continuing to explore their various use cases.

As an example, we’re also doing research around our use of cryptographic techniques like blind digital signatures and anonymized logging to prevent fraud in a number of use cases, including investigating crashes, assessing performance and monitoring product and advertising metrics. This privacy protective approach mitigates privacy concerns, as we are able to apply data minimization practices while simultaneously preventing fraud at scale.

We’re also experimenting with double-blind matching technology to enable matching of data records while preserving privacy. Much of the work in this domain either reveals the matched records to one or both parties or uses a complicated circuit-based construction to preserve privacy. To solve this, we’ve developed two new private matching algorithms that can work within real-world constraints, such as when an entire record is unavailable at the time of matching.
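For intuition only, here is a deliberately naive sketch of matching on blinded values using keyed hashes. The algorithms described above rely on more sophisticated cryptography, and a shared-key scheme like this could be brute-forced on low-entropy identifiers, so treat it purely as an illustration of the matching-without-revealing idea.

```python
import hashlib
import hmac


def blind(identifiers: set[str], key: bytes) -> set[str]:
    """Each party blinds its identifiers with a shared key; only the
    keyed digests are exchanged, never the raw values."""
    return {hmac.new(key, i.encode(), hashlib.sha256).hexdigest()
            for i in identifiers}


shared_key = b"agreed-out-of-band"  # assumption: parties pre-share a key
party_a = {"alice@example.com", "bob@example.com"}
party_b = {"bob@example.com", "carol@example.com"}

# The intersection is computed over blinded values, so non-matching
# records are never revealed in the clear to the other side.
matches = blind(party_a, shared_key) & blind(party_b, shared_key)
```

Real private-matching protocols avoid even a shared key (for example, via commutative encryption), but the core shape is the same: both sides transform their records so that equality is still testable while the records themselves stay hidden.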

05. ONGOING COMMITMENT TO PRIVACY

We’re invested in privacy and are committed to continuous improvement.

Photo of Chief Privacy Officer of Product Michel Protti speaking at an indoor company summit

“Getting privacy right is a continual, collective investment across our company, and is the responsibility of everyone at Facebook to advance our mission.”


- Michel Protti, Chief Privacy Officer for Product

Privacy is one of the defining social issues of our time and is central to Facebook’s vision for the future. When we say privacy is everyone’s priority at Facebook, we mean it; it’s an integral part of everything we do here at Facebook from the top down to the ground up.

We know this is just the beginning. We are building a new privacy foundation to guide us now and in the future. We are committed to seeking feedback from and working with stakeholders across industry, civil society, think tanks and academia to improve our program. Our privacy work is never finished, and we understand that this commitment means continuously improving and focusing on this every day.