Dagstuhl Seminar 25112

PETs and AI: Privacy Washing and the Need for a PETs Evaluation Framework

(Mar 09 – Mar 14, 2025)

Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/25112

Organizers

Contact

Dagstuhl Seminar Wiki

Shared Documents

Programm
  • Upload (use the personal credentials created in DOOR to log in)

Motivation

An increased awareness of personal data collection and of data protection regulations has contributed to the appeal of privacy enhancing technologies (PETs). The premise of PETs is that techniques such as syntactic mechanisms for statistical disclosure control, differential privacy, homomorphic encryption, and secure multiparty computation facilitate data processing while protecting individuals from unwanted disclosures. PETs have been proposed as a way to preserve the functionality of artificial intelligence (AI) while defending against privacy attacks. This field, known as privacy-preserving machine learning (PPML), incorporates PETs at various stages of the machine learning process to (a) train over encrypted data, (b) anonymize the training process, and (c) protect the outputs using differential privacy.
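To make point (c) concrete, the sketch below shows the standard Laplace mechanism for a differentially private counting query. It is a textbook illustration, not part of the seminar materials; the function names and parameter choices are ours.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) noise via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one
    # person changes the count by at most 1. Adding Laplace noise
    # with scale = sensitivity / epsilon gives epsilon-DP.
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Averaged over many runs, the noisy answer concentrates around the true count, while any single released answer hides whether one individual's record is in the data (the smaller epsilon is, the stronger the protection and the noisier the answer).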

Despite the abundance of work on PETs, AI, and their intersection, many challenges remain. Addressing them is crucial to understanding the drawbacks of PETs and to reaping their benefits. Recent works have raised concerns about the efficacy and deployment of PETs, observing that people's fundamental rights continue to be harmed, including, paradoxically, privacy. PETs have been used in surveillance applications and as a privacy washing tool.

How PETs address privacy threats needs a rethink. Protecting personal data is only a first step and is, in many cases, insufficient to protect people from interference with their privacy. Furthermore, an independent evaluation framework is needed to assess the level of privacy protection offered by deployed PETs solutions and to guard against privacy washing. However, most computer scientists are not well equipped to address social problems on their own. This Dagstuhl Seminar therefore aims to bring together computer science and legal scholars working on privacy and AI, along with industry and policy experts and regulators, to explore the role of PETs and the challenge of private and accountable AI.

Copyright Emiliano De Cristofaro, Kris Shrishak, Thorsten Strufe, and Carmela Troncoso

Participants

Please log in to DOOR to see more details.

  • Frederik Armknecht
  • Aurélien Bellet
  • Robin Berjon
  • Asia Biega
  • Paul Comerford
  • Ana-Maria Cretu
  • Emiliano De Cristofaro
  • Yves-Alexandre de Montjoye
  • Sébastien Gambs
  • Georgi Ganev
  • Patricia Guerra-Balboa
  • Johanna Gunawan
  • Seda F. Gürses
  • Bailey Kacsmar
  • Olya Ohrimenko
  • Lucy Qin
  • Phillip Rogaway
  • Reza Shokri
  • Kris Shrishak
  • Thorsten Strufe
  • Hinako Sugiyama
  • Vanessa Teague
  • Carmela Troncoso
  • Michael Veale

Classification
  • Computers and Society
  • Cryptography and Security
  • Machine Learning

Keywords
  • Privacy
  • Privacy Enhancing Technologies
  • Machine Learning
  • Artificial Intelligence
  • Interdisciplinary