Dagstuhl Seminar 25112

PETs and AI: Privacy Washing and the Need for a PETs Evaluation Framework

(Mar 09 – Mar 14, 2025)

Permalink
Please use the following short url to reference this page: https://www.dagstuhl.de/25112

Motivation

An increased awareness of personal data collection and of data protection regulations has contributed to the appeal of privacy enhancing technologies (PETs). The premise of PETs is that techniques such as syntactic mechanisms for statistical disclosure control, differential privacy, homomorphic encryption, and secure multiparty computation facilitate data processing while protecting individuals from unwanted disclosures. PETs have been proposed as a way to preserve the functionality of artificial intelligence (AI) while protecting against privacy attacks. This field, known as privacy-preserving machine learning (PPML), incorporates PETs at various stages of the machine learning process to (a) train over encrypted data, (b) anonymize the training process, and (c) protect the outputs using differential privacy.
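To make one of the PETs named above concrete, the following is a minimal sketch of differential privacy's classic Laplace mechanism for releasing a count query; the function name and parameters are illustrative, not drawn from the seminar text:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with epsilon-differential privacy by adding
    Laplace noise calibrated to sensitivity / epsilon."""
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Example: privately release a count query.
# A counting query has sensitivity 1 (one person changes the count by at most 1).
true_count = 1234
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller values of `epsilon` mean stronger privacy but noisier answers — the tension between utility and protection that an evaluation framework for deployed PETs would need to make measurable.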

Despite the abundance of work on PETs, AI, and their intersection, many challenges remain. Addressing these challenges is crucial to understanding the drawbacks and reaping the benefits of PETs. Recent work has raised concerns about the efficacy and deployment of PETs, observing that fundamental rights continue to be harmed, including, paradoxically, the right to privacy. PETs have been used in surveillance applications and as a privacy-washing tool.

How PETs address privacy threats needs a rethink. Protecting personal data is only a first step and is, in many cases, insufficient to protect people from interference with their privacy. Furthermore, an independent evaluation framework is required to assess the level of privacy protection offered by deployed PETs solutions and to guard against privacy washing. However, most computer scientists are not well equipped to address social problems on their own. This Dagstuhl Seminar therefore aims to bring together computer science and legal scholars working on privacy and AI, along with industry and policy experts and regulators, to explore the role of PETs and the challenge of private and accountable AI.

Copyright Emiliano De Cristofaro, Kris Shrishak, Thorsten Strufe, and Carmela Troncoso

Classification
  • Computers and Society
  • Cryptography and Security
  • Machine Learning

Keywords
  • Privacy
  • Privacy Enhancing Technologies
  • Machine Learning
  • Artificial Intelligence
  • Interdisciplinary