Dagstuhl-Seminar 22042
Privacy Protection of Automated and Self-Driving Vehicles
(January 23 – January 28, 2022)
Organizers
- Frank Kargl (Universität Ulm, DE)
- Ioannis Krontiris (Huawei Technologies - München, DE)
- André Weimerskirch (Lear Corporation - Ann Arbor, US)
- Ian Williams (University of Michigan - Ann Arbor, US)
Contact
- Michael Gerke (for scientific matters)
- Jutka Gasiorowski (for administrative matters)
Program
Cooperative, connected and automated mobility (CCAM) has the potential to drastically reduce accidents, travel time, and the environmental impact of road travel. To achieve these goals, connected and automated vehicles (AVs) require extensive data and machine learning algorithms for processing data received from local sensors, other cars, and road-side infrastructure. This immediately raises the question of privacy and data protection. While privacy for connected vehicles has been considered for many years, AV technology is still in its infancy and the privacy and data protection aspects of AVs are not well addressed. The capabilities of AVs pose new challenges to privacy protection, given that AVs have large sensor arrays that collect data in public spaces. AVs capture data not only from other vehicles, but also from many other parties (e.g., pedestrians walking along a street) with very limited possibilities to offer notice and choice about data-processing policies. Moreover, the driver will not necessarily be the owner of the vehicle, and the majority of AVs may well be owned by fleets.
Our seminar reviewed existing technologies, standards, tools, and frameworks for protecting personal information in CCAM, investigated where such existing techniques clash with the requirements of an AV and its data processing, and identified gaps and roadblocks that need to be addressed on the way to deployment of privacy protection in AVs from a legal, technical, and ethical perspective. While we ran only a shortened online version of the originally planned seminar due to COVID-19 pandemic restrictions, we made very good progress, in particular towards identifying and structuring the challenges. Future meetings will build on these results, discuss the individual challenges in more depth, prioritize the corresponding roadblocks, and push for research to overcome them.
Discussions during the seminar were organized in seven sessions with presentations from renowned experts from industry and academia, and a final discussion that collected and structured outcomes. In the concluding session, we identified four main challenges that we present in this report alongside the talk abstracts.
- The first challenge is ethics and responsible behavior of companies and other actors that collect and process personal data in such systems. Such behavior goes beyond mere regulatory compliance and was seen as a promising way to complement this minimal baseline. Further discussions are required to identify ways to encourage such practices.
- Second, we discussed how regulation needs to evolve for future CCAM systems in order to establish a stable baseline. A challenge here will be to identify to what extent sector-specific regulation will be needed to address specifics of CCAM and if regulation of future systems is reasonable and possible.
- A third challenge is the commercial environment. Industry has to meet regulations and financial expectations, and sometimes must reconcile conflicting goals such as privacy and safety. Understanding and narrowing these trade-offs, while acknowledging that industry faces many such constraints that limit its flexibility, requires further investigation.
- Last but not least, we see strong progress in privacy-enhancing technologies (PETs) as a promising path towards resolving many of the above-mentioned problems. At the same time, many PETs have not been designed for the CCAM domain and might not meet its demands in data quality or latency. For this reason, we see the need to further investigate how well existing PETs meet CCAM requirements and how they can be developed further to do so.
Generally speaking, there is a lack of incentives for enterprises like original equipment manufacturers (OEMs) to go beyond the legal minimum requirements to manage personal data in a privacy-respecting manner, to design privacy-preserving products, or to make the use of personal data transparent to the data subject. During our discussions one question became prominent: What could motivate OEMs to go beyond the bare minimum of legal compliance in the field of data protection? Ethics and trustworthiness, as well as reputation and brand image, could be worth investigating when answering this question. However, the field is massively interdisciplinary, making it necessary to convince the other disciplines involved of the value of data protection for the automotive sector.
There are several technical solutions available for protecting privacy and facilitating the privacy-by-design approach. However, scaling these solutions up to larger systems and integrating them with existing systems often fails because system-level aspects and the related interdisciplinary issues are not taken into account. Further progress is therefore needed in promoting privacy-friendly system engineering, as well as in integrating PETs into complete systems, taking into consideration the special requirements of safety and trust in the automotive domain. Overall, there should be a push for joint efforts to define and deploy technologies that are superior to today's solutions and that are commercially feasible, since cost and effort are split among many participants.
Further progress is also required for the development of best practices, methodologies, and a requirements standard similar to ISO 21434 that supports the engineering of practical privacy solutions in complex systems. This will give OEMs a proper threshold target and allow for efficient solution finding and re-use. That guidance or standard could be a layer on top of regulation, similar to how the UN ECE R155 regulation requires a Cybersecurity Management System (CSMS) for which the ISO 21434 standard defines process requirements.
Automated and autonomous vehicles (AVs) may be the greatest disruptive innovation to travel that we have experienced in a century. Their development coincides with the appearance of connected vehicles. To achieve their goals, connected and automated vehicles require extensive data and machine learning algorithms that process data from local sensors as well as data received from other cars and road-side infrastructure for their decision-making. Specifically, we are seeing the emergence of vehicles that feature an impressive array of sensors and on-board decision-making units capable of coping with an unprecedented amount of data.
While privacy for connected vehicles has been considered for many years, AV technology is still in its infancy and the privacy and data protection aspects of AVs are not well addressed. The capabilities of AVs pose new challenges to privacy protection, given the large sensor arrays of AVs that collect data in public spaces. The massive introduction of sensors and AI technology into automated and autonomous vehicles opens up substantial new privacy and data protection problems, both from the technology research perspective and from the legal and policy perspective, which still need to be clearly articulated, elaborated on, and resolved.
The goal of this Dagstuhl Seminar is twofold:
First, to bring legal and technology experts (for both privacy and automated driving) together to allow an informed and open discussion between these often disjoint groups. Only such a discussion will provide a strong basis for mutual understanding and for achieving the second goal.
The second goal is to produce a scientific roadmap that evaluates where technology, as currently developed and foreseen, falls short of legal data protection requirements, and that identifies directions for how these requirements could be met by adjusting the way technology develops, for example by developing and integrating new privacy-enhancing technologies. This could, for example, focus on areas like:
- Establishment of Trust: A promising research direction is to investigate the integration of trusted computing technologies and the (partial) shifting of trust from the back-end infrastructure to the edge (i.e., vehicles). Another development we aim to discuss is how to leverage software-based Trusted Execution Environments (TEEs) to confine the processing of personal data within a secure enclave, verified by remote attestation to run a certified process that will not use personal data outside the declared purpose.
- Advanced AI Techniques: We identify two examples here that we want to analyze. First, we want to investigate advanced AI techniques like federated learning, which enables training a shared model across decentralized devices or servers, each using only its local dataset. A second topic concerns the anonymization of video recorded by vehicle cameras. One solution to respect privacy is to anonymize the recorded data immediately, e.g., by blurring faces and license plates. However, applying these techniques to the training set can impact the environment-detection quality of vehicles to varying degrees, which is something that is not yet well understood.
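The federated learning idea above can be illustrated with a minimal federated-averaging (FedAvg) sketch. Everything here is an illustrative simplification, not part of any real CCAM system: a handful of "vehicles" each fit a one-parameter linear model on private local data, and only the locally trained weight (never the raw data) is sent to a server that averages the client updates.

```python
# Minimal FedAvg sketch: several clients ("vehicles") fit y = w * x on
# local data; only weights leave the device, the raw samples never do.
# All data and parameters below are hypothetical, chosen for illustration.

def local_update(w, data, lr=0.05, epochs=20):
    """One client's gradient-descent steps on its private dataset."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_average(w, client_datasets, rounds=10):
    """Server loop: collect locally trained weights, average them."""
    for _ in range(rounds):
        updates = [local_update(w, data) for data in client_datasets]
        w = sum(updates) / len(updates)  # FedAvg aggregation step
    return w

# Three vehicles observe noisy samples of the same relation y ~ 3x.
clients = [
    [(1.0, 3.1), (2.0, 6.0)],
    [(1.5, 4.4), (3.0, 9.2)],
    [(0.5, 1.4), (2.5, 7.6)],
]
w = federated_average(0.0, clients)  # converges near w = 3
```

The privacy question the seminar raises sits precisely in the gap this sketch glosses over: even weight updates can leak information about local data, which is why real deployments combine federated learning with further PETs such as secure aggregation or differential privacy.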
One question is whether technical solutions can satisfy the variety of international legal requirements. The other question is whether such solutions are sufficiently mature and reliable to be integrated into a self-driving car.
We will also address the question of whether these legal frameworks are sufficiently forward-looking to cover the new challenges created by AVs and intelligent transportation, or whether they would, in the worst case, even block technological progress by not allowing, for example, efficient collection of training material for machine learning systems. We also ask whether the variety of data protection regimes worldwide would be a hindrance and whether harmonization should be sought.
- Ala'a Al-Momani (Universität Ulm, DE)
- Ines Ben Jemaa (IRT SystemX - Palaiseau, FR)
- Benedikt Brecht (Volkswagen AG - Berlin, DE)
- Michael Buchholz (Universität Ulm, DE)
- Thanassis Giannetsos (UBITECH Ltd. - Athens, GR)
- Dorothy J. Glancy (Santa Clara University, US)
- Adam Henschke (University of Twente, NL)
- Mario Hoffmann (Continental Teves - Frankfurt-Sossenheim, DE)
- Frank Kargl (Universität Ulm, DE) [dblp]
- Alexander Kiening (Denso Automotive - Eching, DE)
- Ioannis Krontiris (Huawei Technologies - München, DE) [dblp]
- Jason Millar (University of Ottawa, CA)
- Kyriaki Noussia (University of Reading, GB)
- Christos Papadopoulos (University of Memphis, US)
- Jonathan Petit (Qualcomm, US) [dblp]
- Chrysi Sakellari (Toyota Motor Europe - Brussels, BE)
- Oyunchimeg Shagdar (VEDECOM - Versailles, FR)
- Yu Shang (Huawei Technologies - Shanghai, CN)
- Lauren Smith (Cruise - Washington, US)
- Karlyn D. Stanley (RAND - Arlington, US)
- Rejani Syamala (Stellantis - Troy, US)
- Natasa Trkulja (Universität Ulm, DE)
- Jessica Uguccioni (Law Commission of England and Wales - London, GB)
- Bryant Walker Smith (University of South Carolina, US)
- André Weimerskirch (Lear Corporation - Ann Arbor, US)
- Ian Williams (University of Michigan - Ann Arbor, US)
- Harald Zwingelberg (ULD SH - Kiel, DE)
Related Seminars
- Dagstuhl Seminar 23242: Privacy Protection of Automated and Self-Driving Vehicles (2023-06-11 - 2023-06-16)
Classification
- Computers and Society
- Cryptography and Security
- Emerging Technologies
Keywords
- Privacy and Data Protection
- Automotive Security and Privacy