Privacy Enhancing Technologies: protecting privacy in the age of big data

By Peter Fleischer, Global Privacy Counsel, & Rafaela Nicolazzi, Government Affairs & Public Policy Manager, Google

The world is going digital, and this digital transformation will accelerate in the years ahead. Every aspect of our lives is moving online, for consumers, businesses, educators, and governments alike. The data generated by this transformation is growing accordingly. At the same time, privacy remains a fundamental human right. The digital transformation must evolve with strong privacy protections, and privacy laws around the world require that citizens know and control how their personal information is collected, used, and shared.

Technology is a key part of ensuring that privacy is protected in this digital transformation. Of course, technology has created privacy challenges. But technology can also be a strong tool to protect privacy. In particular, new and emerging privacy-preserving technologies offer effective ways to safeguard and enhance individual privacy while still allowing society to unlock the immense benefits that can be obtained from the increased use and study of data.

The terms Privacy-Enhancing Technologies (PETs) and Privacy-Preserving Technologies (PPTs) encompass a wide range of applications that can help developers and researchers analyze data without revealing personal information. Through the use of PETs, companies, researchers, and governments are developing meaningful, useful insights and services while protecting individual privacy. There are many PETs, and more will be invented over time. Below, we highlight three techniques that have proven effective at minimizing data footprints, de-identifying data, and restricting access to data.

Federated Learning is a data minimization technology that trains machine learning models without any raw data leaving the device. In other words, it enables the training of machine learning models while keeping all the raw input data on the client. Federated Learning is an example of how privacy technology benefits both organizations and end users, and it is available for researchers to experiment with via TensorFlow Federated, an open-source framework for machine learning and other computations on decentralized data. This means organizations across the globe can leverage this powerful yet private technology to enhance their products and services.
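The core idea can be illustrated with a minimal sketch of federated averaging in plain Python and NumPy (not the TensorFlow Federated API itself; the clients, data, and model here are entirely hypothetical). Each client trains on its own private data and sends only updated model weights to the server, which averages them:

```python
import numpy as np

# Toy sketch of federated averaging: each client fits a linear model
# on data that never leaves it; only model weights are shared.

def local_update(weights, x, y, lr=0.1, steps=10):
    """Run a few gradient-descent steps on one client's private data."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * x.T @ (x @ w - y) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Five clients, each holding its own private dataset.
clients = []
for _ in range(5):
    x = rng.normal(size=(20, 2))
    y = x @ true_w + rng.normal(scale=0.1, size=20)
    clients.append((x, y))

global_w = np.zeros(2)
for _ in range(20):  # federated rounds
    # The server averages the locally trained weights; it never sees x or y.
    global_w = np.mean(
        [local_update(global_w, x, y) for x, y in clients], axis=0
    )

print(global_w)  # converges close to the true weights [2.0, -1.0]
```

The key property to notice is that the server's only inputs are weight vectors; the raw `(x, y)` pairs stay on each client throughout training.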

Differential Privacy is an advanced anonymization technology that allows organizations to gain insights from data without compromising user anonymity. It protects data by introducing just the right amount of “noise” to the data, using advanced mathematical algorithms. It means people can safely use large, aggregated data sets to inform research and business decisions, without revealing people’s individual data. It is being applied in a wide range of settings, such as researching new medical treatments using medical records without compromising patient privacy, or studying population movements in response to Covid-driven restrictions without tracking any particular individual.
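A minimal sketch of this “noise” idea is the Laplace mechanism, a standard differential privacy technique (the dataset, threshold, and epsilon below are illustrative, not from any real deployment). Noise is calibrated to how much one individual can change the answer, so the aggregate stays useful while no single record is revealed:

```python
import numpy as np

def private_count(values, threshold, epsilon=1.0):
    """Count entries above a threshold, then add Laplace noise.

    The count's sensitivity is 1 (one person can change it by at
    most 1), so noise with scale 1/epsilon gives epsilon-DP.
    """
    true_count = sum(v > threshold for v in values)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical sensitive records: ages in a small dataset.
ages = [23, 35, 47, 52, 61, 29, 44]

# The true count of people over 40 is 4; each query returns a
# noisy value centered on 4.
print(private_count(ages, threshold=40))
```

Smaller values of `epsilon` mean more noise and stronger privacy; analysts tune this trade-off between accuracy and protection.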

Private Join and Compute, a form of Secure Multi-Party Computation, is a cryptographic protocol that helps two or more organizations run aggregated computations on sensitive data they would not normally share with others. It helps solve real-world problems in business, research, social science, and more. This form of multi-party computation is private by design: it builds upon two fundamental cryptographic techniques to ensure that nothing but the size of the joined set and statistics of its associated values is revealed. Financial modeling is one area where this technology can be deployed: Bank A doesn’t want to share data with Bank B, because it may contain proprietary insights, not to mention the fact that Bank A is obligated to protect its customers’ personal information. Yet there is value in sharing the data to gain new insights into financial products. To protect the proprietary data of both parties, Private Join and Compute is employed.
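To give a flavor of how two parties can compute a joint statistic without revealing their inputs, here is a toy additive secret-sharing sketch (this is not the actual Private Join and Compute protocol, which uses commutative and homomorphic encryption; the bank totals are hypothetical). Each party splits its value into random shares, so neither side ever sees the other's raw number, yet the shares combine into the joint sum:

```python
import random

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret):
    """Split a secret into two additive shares mod MODULUS."""
    r = random.randrange(MODULUS)
    return r, (secret - r) % MODULUS

bank_a_total = 1_250_000   # Bank A's sensitive value (hypothetical)
bank_b_total = 3_400_000   # Bank B's sensitive value (hypothetical)

a1, a2 = share(bank_a_total)  # Bank A keeps a1, sends a2 to Bank B
b1, b2 = share(bank_b_total)  # Bank B keeps b1, sends b2 to Bank A

# Each share on its own is a uniformly random number, so receiving
# one share reveals nothing. Each party publishes only the sum of
# the shares it holds.
partial_a = (a1 + b2) % MODULUS
partial_b = (b1 + a2) % MODULUS

joint_total = (partial_a + partial_b) % MODULUS
print(joint_total)  # 4650000: the joint sum, with neither input revealed
```

The same principle, extended with encryption over joined record sets, is what lets the banks learn an aggregate statistic while each keeps its customer data private.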

Privacy-enhancing technologies are tools for privacy by design, and great examples of privacy through innovation. PETs can complement robust privacy laws and regulatory oversight around the world. PETs have the potential to greatly benefit everyone, by allowing individuals, governments, non-profits, and businesses to use and obtain valuable insights from data whilst maintaining individual privacy.

We foresee a virtuous circle as organizations, researchers, and policy makers support the evolution of PETs. Organizations should continue to approach privacy as a challenge for innovation. Researchers are bringing their technology expertise to develop and refine these techniques. And policy makers can support and encourage the use and adoption of PETs. To highlight a couple of examples: the US and UK recently announced a prize challenge of 1.6M USD for federated learning innovations, and the European Data Protection Supervisor recently identified Federated Learning as one of the top 5 foreseen trends for 2022/2023 due to its potential for machine-learning models that preserve privacy rights.

At Google, we’re investing in the development and democratization of PETs: https://developers.googleblog.com/2022/12/new-privacy-enhancing-technology-for-everyone.html

With more collective investment in the development and adoption of PETs, we see a future where we all get the best of both worlds: learning from data to solve the world’s problems, and preserving individual privacy.
