At a glance
- PETs are available for a variety of purposes (eg securely training AI models, generating anonymous statistics and sharing information between different parties).
- Differential privacy generates anonymous statistics. This is usually done by adding random noise during the computation, so that the output does not reveal whether any particular individual's information was included.
- Synthetic data is artificially generated data that mimics the statistical properties of real data, providing realistic datasets in environments where access to large real datasets is not possible.
- Homomorphic encryption provides strong security and confidentiality by enabling computations on encrypted data without first decrypting it.
- Zero-knowledge proofs (ZKP) provide data minimisation by enabling someone to prove something about their private information without revealing the information itself.
- Trusted execution environments enhance security by enabling processing by a secure part of a computer processor that is isolated from the main operating system and other applications.
- Secure multiparty computation (SMPC) provides data minimisation and security by allowing different parties to jointly perform processing on their combined information, without any party needing to share all of its information with each of the other parties.
- Federated learning trains machine learning models in distributed settings, minimising the personal information each party has to share. Using federated learning alone may not achieve appropriate protection of personal information, and designing mitigations (eg combining it with other PETs at different stages of your processing) may require specific expertise.
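To make the differential privacy point concrete, here is a minimal sketch of the Laplace mechanism, one common way of adding calibrated random noise to a statistic. The function name `dp_count` and the choice of parameters are illustrative assumptions, not part of any particular library.

```python
import math
import random

def dp_count(values, predicate, epsilon, sensitivity=1.0):
    """Return a count with Laplace noise calibrated to the privacy budget epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    scale = sensitivity / epsilon
    # Inverse-CDF sample from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Each query adds fresh noise, so no single release pins down the exact count.
noisy = dp_count(range(1, 11), lambda v: v > 5, epsilon=1.0)
```

A smaller epsilon means a larger noise scale and stronger privacy, at the cost of less accurate statistics.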
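The homomorphic encryption idea can be illustrated with a toy version of the Paillier cryptosystem, an additively homomorphic scheme in which multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The tiny primes below are assumptions for illustration only and provide no real security; practical deployments use vetted libraries and far larger parameters.

```python
import math
import random

# Toy Paillier key material; p and q are illustrative and insecure.
p, q = 17, 19
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
# Decryption uses L(u) = (u - 1) // n and the modular inverse mu.
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Multiplying ciphertexts adds the underlying plaintexts,
# so a party can compute on data it cannot read.
c = (encrypt(12) * encrypt(30)) % n2
assert decrypt(c) == 42
```

Paillier supports only addition on ciphertexts; fully homomorphic schemes extend this to arbitrary computations, at a higher performance cost.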
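A minimal sketch of additive secret sharing, a basic SMPC building block, shows how parties can jointly compute a total without any party revealing its own input. The field modulus and the three-party salary example are illustrative assumptions.

```python
import random

PRIME = 2**61 - 1  # illustrative field modulus

def share(secret, n_parties):
    """Split a secret into random shares that sum to it modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Three parties learn the total of their salaries without revealing their
# own figures: each distributes shares of its input, each party sums the
# shares it holds, and only those partial sums are combined.
salaries = [30_000, 45_000, 52_000]
all_shares = [share(s, 3) for s in salaries]
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]
total = reconstruct(partial_sums)
assert total == sum(salaries)
```

Each individual share is uniformly random, so no single party (or any group smaller than all of them) learns anything about another party's input.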
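Finally, the federated learning bullet can be sketched via its aggregation step: clients train locally and send only model parameters, which a server averages weighted by local dataset size (the FedAvg approach). The function name and the two-client example are illustrative assumptions.

```python
def federated_average(client_weights, client_sizes):
    """Weighted average of per-client parameter vectors (FedAvg aggregation)."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * size for w, size in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Two clients with different amounts of local data; raw data never leaves them.
global_weights = federated_average(
    [[1.0, 2.0], [3.0, 4.0]],  # parameters after local training
    [100, 300],                # local dataset sizes
)
# The larger client's update counts three times as much: [2.5, 3.5]
```

Note that model updates can still leak information about training data, which is why the bullet above warns that federated learning alone may need combining with other PETs (eg differential privacy on the updates).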
In detail
- Introduction
- Differential privacy
- Synthetic data
- Homomorphic encryption (HE)
- Zero-knowledge proofs
- Trusted execution environments
- Secure multiparty computation (SMPC)
- Private set intersection (PSI)
- Federated learning
- Reference table