
Reference table


Each entry below provides example applications for the PETs discussed in this guidance, together with information about available standards, known weaknesses, and how the PET supports data protection compliance. Your purposes may require a combination of techniques to provide the required protection at all stages of the data processing lifecycle. This is not an exhaustive list.

Secure multiparty computation

Applications:
- Cryptographic key protection within a single organisation: secure multiparty computation allows an organisation to split its secret keys across multiple hosts.
- Pseudonymisation within a single organisation.
- Privacy-preserving analytics (eg training neural networks, evaluating decision trees).
- Secure collaborative computation (eg processing that requires multiple parties to share information between them for joint analysis of the combined data).
- Speeding up the validation process for AI models.

Standards:
- IEEE 2842-2021 – IEEE Recommended practice for secure multi-party computation.
- ITU-T X.1770 – Technical guidelines for secure multi-party computation.
- The IETF is currently developing a draft multi-party privacy-preserving measurement (PPM) protocol standard.

Known weaknesses:
- Requires significant computational resources, and communication costs can be high.

Data protection compliance:
- Data minimisation
- Security
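
To illustrate the key-splitting application above, here is a minimal sketch of additive secret sharing, a common building block of secure multiparty computation. The field modulus, share count and key value are assumptions for illustration; a production system would use the secrets module and a vetted MPC library rather than hand-rolled sharing.

```python
import random

PRIME = 2**127 - 1   # assumed field modulus, for illustration only

def split(secret, n):
    """Split a secret into n additive shares modulo a prime.
    Any n - 1 shares are uniformly random and reveal nothing."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """All n shares together recover the secret."""
    return sum(shares) % PRIME

key = 0x1234_5678_9ABC_DEF0      # hypothetical secret key material
shares = split(key, 3)           # eg one share per host
assert reconstruct(shares) == key
```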

Homomorphic encryption

Applications:
- Leverage cloud computing and storage services securely, as information held off-site is encrypted but can still be processed.
- Secure machine learning as a service: information can be processed without giving the processor access to it in unencrypted form.
- Secure collaborative computation.

Standards:
- Community standard for homomorphic encryption.
- ISO/IEC 18033-6:2019 – IT Security techniques — Encryption algorithms — Part 6: Homomorphic encryption.
- In development: ISO/IEC WD 18033-8 – Information security — Encryption algorithms — Part 8: Fully Homomorphic Encryption.

Known weaknesses:
- Scalability and computation speed can be an issue.
- Fully homomorphic encryption is unsuitable for real-time information analysis.

Data protection compliance:
- Accuracy
- Security
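
As a concrete illustration of processing information without decrypting it, below is a toy sketch of the Paillier cryptosystem, a partially homomorphic scheme in which multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The primes are deliberately tiny, assumed values; real deployments use an audited library with keys of thousands of bits.

```python
import random
from math import gcd

# Toy primes, assumed for illustration only
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1                      # standard simple choice of generator
lam = (p - 1) * (q - 1)        # phi(n); serves as the private exponent here
mu = pow(lam, -1, n)           # decryption constant when g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = encrypt(20), encrypt(22)
# Multiplying ciphertexts adds the underlying plaintexts
assert decrypt((a * b) % n2) == 42
```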

Differential privacy

Applications:
- Performing statistical analysis with privacy guarantees (ie that the presence or absence of a person in the information does not significantly affect the final output of the algorithm).
- Useful for allowing databases to be queried without releasing information about people in the database.

Standards:
- No standard available.

Known weaknesses:
- No consensus over the optimal trade-off between privacy and utility. The level of noise added will depend on the circumstances of the processing.

Data protection compliance:
- Reduces identifiability of personal information, or renders it anonymous information
- Purpose limitation
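
A minimal sketch of the standard Laplace mechanism for the noisy query answering described above. The privacy budget epsilon is an assumed parameter; a counting query has sensitivity 1 because one person changes the count by at most 1.

```python
import random

def laplace_count(true_count, epsilon=1.0, sensitivity=1.0):
    """Answer a counting query with Laplace(0, sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    # A Laplace sample is the difference of two i.i.d. exponential samples
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# eg "how many records match this query?" answered with a privacy guarantee;
# smaller epsilon means more noise and stronger privacy
print(laplace_count(128, epsilon=0.5))
```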

Zero-knowledge proofs

Applications:
- Proving claims about personal information (eg nationality, solvency, age, transactions).

Standards:
- ZKProof Community Reference (2019).
- ISO/IEC 9798-5:2009 – Information technology — Security techniques — Entity authentication — Part 5: Mechanisms using zero-knowledge techniques.

Known weaknesses:
- Weaknesses can arise from poor implementation of the protocol.
- Interactive protocols may be more vulnerable to side channel or timing attacks, as they require the prover to send multiple messages.

Data protection compliance:
- Data minimisation
- Security
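
To make the idea concrete, here is a toy interactive Schnorr protocol, a classic zero-knowledge proof of knowledge of a discrete logarithm. The group parameters are tiny, assumed values for illustration; note that the commit-challenge-response message flow is exactly the interactive pattern flagged above as more exposed to side channel and timing attacks.

```python
import random

# Tiny assumed group parameters: p and q are prime, and g = 4 generates a
# subgroup of order q modulo p. Real systems use large, standardised groups.
p, q, g = 1019, 509, 4

x = random.randrange(1, q)       # prover's secret (eg an attribute credential)
y = pow(g, x, p)                 # public value the claim is about

# One round of commit / challenge / response
r = random.randrange(1, q)
t = pow(g, r, p)                 # prover commits
c = random.randrange(1, q)       # verifier challenges
s = (r + c * x) % q              # prover responds

# Verifier accepts that the prover knows x, without learning x itself
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```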

Generating synthetic data

Applications:
- Use cases that require access to large amounts of information (eg model training, research and development).

Standards:
- No standard available, although work on this topic is underway in IEEE SA – Synthetic data.

Known weaknesses:
- Synthetic data may not represent outliers present in the original personal information.
- Requires an assessment of whether the personal information on which the synthetic data was trained can be reconstructed.
- Further additional measures (eg differential privacy) may be required to protect against singling out.

Data protection compliance:
- Data minimisation
- Purpose limitation
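
A deliberately naive sketch of synthetic data generation, with made-up records: fit an independent normal distribution per column and sample from it. It illustrates the weaknesses listed above: independent per-column sampling destroys correlations between fields, and a fitted Gaussian underrepresents the outliers in the tails of the original data.

```python
import random
import statistics

# Hypothetical original records (made-up values for illustration)
real = [
    {"age": 34, "income": 28000},
    {"age": 51, "income": 61000},
    {"age": 29, "income": 31000},
    {"age": 45, "income": 47000},
]

def synthesise(records, n):
    """Sample n synthetic records from per-column normal fits.
    Cross-column correlations and tail outliers are lost, which is why
    generated data still needs the assessments listed above."""
    cols = {k: [r[k] for r in records] for k in records[0]}
    fits = {k: (statistics.mean(v), statistics.stdev(v)) for k, v in cols.items()}
    return [{k: random.gauss(mu, sigma) for k, (mu, sigma) in fits.items()}
            for _ in range(n)]

print(synthesise(real, 3))
```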

Federated learning

Applications:
- Applications where the aggregation of information into a centralised data server is not feasible or desirable (eg building models of user behaviour from device data without it leaving the devices, or carrying out research on information from multiple entities without it being transmitted between them).

Standards:
- IEEE 3652.1-2020 – IEEE Guide for architectural framework and application of federated machine learning.

Known weaknesses:
- The devices or entities contributing information need compatible formats and standards so the analysis can be carried out locally. This also requires sufficient local computing power.
- Federated learning requires frequent communication between the participating entities, and therefore sufficient bandwidth.
- Requires other PETs to protect people's information. This may affect performance and scalability.

Data protection compliance:
- Data minimisation
- Security (when combined with other PETs)
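
A minimal sketch of the aggregation step in federated averaging (FedAvg-style), assuming each participant has already trained locally: only model parameters and local sample counts travel to the coordinator, never the raw records. The weights and counts here are made-up values.

```python
def fed_avg(client_weights, client_sizes):
    """Weighted average of locally trained parameter vectors.
    Only parameters and sample counts leave each client, not raw data."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Three hypothetical clients, each contributing a 2-parameter local model
global_model = fed_avg(
    client_weights=[[0.10, 0.90], [0.20, 0.80], [0.40, 0.60]],
    client_sizes=[100, 50, 50],
)
print(global_model)   # approximately [0.2, 0.8]
```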

Trusted execution environments

Applications:
- Protection against software attacks.
- Processing particularly confidential information within an existing system or device.

Standards:
- IEEE 2830-2021, IEEE 2952-2023, and NISTIR 8320 (Hardware-Enabled Security: Cloud and Edge Computing Use Cases).
- In development: other standardisation initiatives from the IETF Trusted Execution Environment Provisioning (TEEP) working group, GlobalPlatform, the Trusted Computing Group and the Confidential Computing Consortium.

Known weaknesses:
- May be vulnerable to side channel attacks. These attacks monitor certain properties of the system, such as the time required to perform an operation, to learn sensitive information.

Data protection compliance:
- Accountability
- Security
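
The timing attack described above exploits data-dependent execution time. As a small illustration of the mitigation style used both inside and outside TEEs, here is a constant-time byte-string comparison; this is a sketch of the idea only, and real enclave code would rely on the platform's vetted primitives (Python itself ships hmac.compare_digest for this purpose).

```python
def ct_equal(a: bytes, b: bytes) -> bool:
    """Compare two byte strings without an early exit, so the running time
    does not reveal the position of the first differing byte."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y        # accumulate differences instead of returning early
    return diff == 0

# A naive `a == b` can return as soon as a byte differs, leaking a timing signal
assert ct_equal(b"secret-token", b"secret-token")
assert not ct_equal(b"secret-token", b"secret-tokem")
```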