Dagstuhl Seminar 25261
Future of Human-Centered Privacy
(Jun 22 – Jun 27, 2025)
Organizers
- Zinaida Benenson (Universität Erlangen-Nürnberg, DE)
- Simone Fischer-Hübner (Karlstad University, SE)
- Heather Richter Lipford (University of North Carolina - Charlotte, US)
- William Seymour (King's College London, GB)
Contact
- Marsha Kleinbauer (for scientific matters)
- Christina Schwarz (for administrative matters)
Human-centered privacy lies at the intersection of privacy and human-computer interaction (HCI) research. It investigates users' privacy perceptions, concerns, and awareness in various settings, as well as the understanding, usefulness, and usage of various privacy-enhancing technologies. On the one hand, the advance of the Internet of Things, smart spaces, and AI has raised new questions that need to be investigated, e.g., how to negotiate privacy settings in the presence of different users of the same system, or how to improve the transparency of AI systems. On the other hand, there are many questions that have been explored for decades but need to be adapted to these new areas and domains, such as "What is a privacy decision?" and "What information do users need to make a privacy decision?". Moreover, the multitude of users also includes at-risk and vulnerable populations that interact (sometimes unwillingly or unknowingly) with digital systems and require additional research to understand their needs.
Four primary topics have been identified for discussion at this Dagstuhl Seminar, and the participants will be free to adjust them or to define additional ones.
- Inclusive Privacy has long been a focus in privacy research, centring on challenges that disparately impact particular groups within society. These groups face additional barriers to achieving privacy compared to others, and there is increasingly mainstream recognition that (1) communities experience differential impact from online threats, with seemingly innocuous technologies having (negative) life-changing impacts on some communities; (2) generic privacy designs do not adequately serve all individuals and communities across their diversity of identities and cultural backgrounds; and (3) while online privacy needs to better fit the needs of diverse individuals, inclusive privacy research faces numerous access and methodological challenges.
- Multiuser Privacy for smart devices and smart spaces involves many users whose preferences need to be recognised and negotiated. Yet the technological make-up of many current digital systems still assumes a single system owner, and often also a single account per system. Moreover, modern environments such as smart homes are complex and "messy": they often consist of devices bought at different times, with varying technical abilities to cooperate with already existing devices, which makes such environments technically challenging.
- Privacy and AI remains a significant public concern, even though it has not yet been a primary focus of research. Key usable privacy challenges for AI include those related to the trustworthy AI principles of human empowerment and user control, transparency, and explainability. On the other hand, AI can also be utilised to enhance usable privacy in a trustworthy and GDPR-compliant manner, including personalised AI-based systems that make privacy notices more accessible and usable for a diverse group of users and/or support them in making privacy decisions that match their preferences and needs. In this context, the usability challenge that users tend to "overtrust" AI systems and attribute human characteristics to them (the "ELIZA effect") also needs to be considered in order to achieve trustworthy usable privacy.
- Privacy Communication forms the heart of individuals' right to make their own decisions regarding the disclosure and use of their personal data. Exercising the right to informational self-determination requires that individuals are well informed about the extent, purposes, and consequences of the processing of their personal data, yet users are often "tricked" (e.g., via dark patterns) into disclosing more personal data than intended. In general, enforcing usable transparency and control in practice remains a challenge due to several HCI-related factors: privacy is usually only a secondary goal for users; cognitive overload leads to habituation; and bounded rationality, behavioural biases, and the information asymmetry between users and service providers make rational choices difficult for users.
This seminar will serve as a platform for developing an interdisciplinary understanding of the above topics. One desired outcome is a roadmap for human-centered privacy that will be broadly disseminated and published. Additionally, we aim to foster multidisciplinary collaborations, such as joint publications and projects.
Classification
- Computers and Society
- Cryptography and Security
- Human-Computer Interaction
Keywords
- inclusive privacy
- multiuser privacy
- privacy and AI
- privacy decisions