Dagstuhl Seminar 25112
PETs and AI: Privacy Washing and the Need for a PETs Evaluation Framework
(Mar 09 – Mar 14, 2025)
Organizers
- Emiliano De Cristofaro (University of California - Riverside, US)
- Kris Shrishak (Irish Council for Civil Liberties - Dublin, IE)
- Thorsten Strufe (KIT - Karlsruher Institut für Technologie, DE)
- Carmela Troncoso (EPFL - Lausanne, CH)
Contact
- Michael Gerke (for scientific matters)
- Simone Schilke (for administrative matters)
An increased awareness of personal data collection and of data protection regulations has contributed to the appeal of privacy enhancing technologies (PETs). The premise of PETs is that techniques such as syntactic mechanisms for statistical disclosure control, differential privacy, homomorphic encryption, and secure multiparty computation facilitate data processing while protecting individuals from unwanted disclosures. PETs have been proposed as a way to preserve the functionality of artificial intelligence (AI) while defending against privacy attacks. This field, known as privacy-preserving machine learning (PPML), incorporates PETs at various stages of the machine learning process to (a) train over encrypted data, (b) anonymize the training process, and (c) protect the outputs using differential privacy.
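To make point (c) concrete, the sketch below illustrates the Laplace mechanism, a standard way to release a query result with differential privacy. This is a minimal illustration, not part of the seminar program; the function name and parameters are chosen here for exposition, and the example assumes a simple count query with sensitivity 1.

```python
import random
import math

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with Laplace noise of scale sensitivity/epsilon,
    satisfying epsilon-differential privacy for queries with the
    given sensitivity."""
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) via inverse-CDF transform of a uniform draw
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Example: privately release a count query (sensitivity 1)
true_count = 42
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=1.0)
```

Smaller values of `epsilon` give stronger privacy at the cost of noisier answers; the same calibration idea underlies the output-perturbation stage of many PPML pipelines.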
Despite the abundance of work on PETs, AI, and their intersection, many challenges remain. Addressing these challenges is crucial to understanding the drawbacks and reaping the benefits of PETs. Recent works have raised concerns about the efficacy and deployment of PETs, observing that people's fundamental rights continue to be harmed, including, paradoxically, privacy. PETs have been used in surveillance applications and as a privacy washing tool.
How PETs address privacy threats needs to be rethought. Protecting personal data is only a first step, and in many cases it is insufficient to protect people from interference with their privacy. Furthermore, an independent evaluation framework is required to assess the level of privacy protection offered by deployed PETs solutions and to guard against privacy washing. However, most computer scientists are not well equipped to address social problems on their own. Thus, this Dagstuhl Seminar aims to bring together computer science and legal scholars working on privacy and AI, along with industry practitioners, policy experts, and regulators, to explore the role of PETs and the challenge of private and accountable AI.
Classification
- Computers and Society
- Cryptography and Security
- Machine Learning
Keywords
- Privacy
- Privacy Enhancing Technologies
- Machine Learning
- Artificial Intelligence
- Interdisciplinary