Dagstuhl Perspectives Workshop 14401
Privacy and Security in an Age of Surveillance
(Sep 28 – Oct 02, 2014)
Organizers
- Matt Blaze (University of Pennsylvania, US)
- Bart Preneel (KU Leuven, BE)
- Phillip Rogaway (University of California - Davis, US)
- Mark D. Ryan (University of Birmingham, GB)
- Peter Y. A. Ryan (University of Luxembourg, LU)
Contact
- Annette Beyer (for administrative matters)
Publications
- Privacy and Security in an Age of Surveillance (Dagstuhl Perspectives Workshop 14401). Bart Preneel, Phillip Rogaway, Mark D. Ryan, and Peter Y. A. Ryan. In Dagstuhl Reports, Volume 4, Issue 9, pp. 106-123, Schloss Dagstuhl - Leibniz-Zentrum für Informatik (2015)
- Privacy and Security in an Age of Surveillance (Dagstuhl Perspectives Workshop 14401). Bart Preneel, Phillip Rogaway, Mark D. Ryan, and Peter Y. A. Ryan. In Dagstuhl Manifestos, Volume 5, Issue 1, pp. 25-37, Schloss Dagstuhl - Leibniz-Zentrum für Informatik (2015)
The Snowden revelations have demonstrated that the US and other nations are amassing data about the minutiae of the daily lives of all citizens on an unprecedented scale. The data includes all forms of electronic communications among people, as well as web accesses, financial data, and the physical movements of people through cell-phone location tracking. The data is collected in numerous ways, using active as well as passive measures. Internet and telecommunication companies contribute their customers' data to the NSA and GCHQ, via programmes such as PRISM and Tempora. Additionally, the NSA and GCHQ have, allegedly, covertly weakened the encryption implementations in commercial software products and international standards - for example, by weakening the randomness of generated keys - in order to gain access to still more data.
Nevertheless, intelligence services perform an important role in protecting democratic societies against the threats posed by criminal and terrorist activities. Indeed, protecting citizens from harm is the first duty of government. In any society, individuals have to be accountable to society as a whole. Privacy is not an absolute right, but has to be balanced against other requirements, including security. But the scope of surveillance must be limited by an understanding of its costs as well as its benefits. As technology continues to mediate in all aspects of our lives, it becomes vital to identify principles about when data may be gathered and what it may be used for. This is a task requiring sociologists, political scientists, and computer scientists.
Thus, a tension exists between the privacy rights of the individual and the security of society as a whole; establishing and maintaining the right balance between these is a major challenge. The activities of intelligence services cannot be fully transparent, and this makes it challenging to find mechanisms for oversight that provide sufficient public assurance. The data generated by our online lives is also valuable to commercial organisations, including the companies that directly collect it. Here again there is a balance to be struck between an individual's need for privacy, her wish for functionality, her limited ability to understand and make decisions in this space, and businesses' desire to sell products and services.
A major challenge here is to find ways to ensure that the intelligence services are acting within agreed laws and regulations without revealing the exact details of their activities and capabilities. At first glance this seems like a classic problem addressed using techniques from "modern cryptography"; such techniques include secret sharing, key escrow, private information retrieval, secure multiparty computation, and private, outsourced computation. For example, zero-knowledge protocols are frequently used to ensure that agents are obeying the rules of a protocol without revealing their exact behaviour. It is not clear, however, that such techniques transfer cleanly to the problems at hand: for example, the notion of "correct" behaviour for intelligence agencies seems harder to characterise precisely. Exploring the boundary between what aspects of this problem can be solved by technical means and which require procedural or legal means is a key theme of this workshop.
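To make one of the techniques named above concrete, secret sharing splits a sensitive value - say, an escrowed key - among several oversight parties so that no single party can reconstruct it alone. The following is a minimal, purely illustrative Shamir-style (t, n) sketch over a prime field; the parameters and the oversight scenario in the comments are assumptions for the example, not any deployed agency mechanism:

```python
import secrets

P = 2**127 - 1  # a Mersenne prime; the field in which shares live (illustrative choice)

def split_secret(secret: int, n: int, t: int):
    """Split `secret` into n shares such that any t of them suffice to reconstruct it."""
    # Random polynomial of degree t-1 with the secret as constant term.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term, i.e. the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        # pow(den, P-2, P) is the modular inverse of den, since P is prime.
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

# Example: a key split among five hypothetical overseers; any three can recover it.
key = 123456789
shares = split_secret(key, n=5, t=3)
assert reconstruct(shares[:3]) == key
assert reconstruct(shares[1:4]) == key
```

The point of the construction for oversight is the threshold: fewer than t overseers learn nothing about the key, so access requires agreement among several independent parties.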
In most countries, the legal frameworks for dealing with privacy and intelligence seem to be arcane, out-of-date, and largely rooted in historical accident. They were made for a different era when we simply didn't have the problem we are facing today. The interaction of new technology, the law, and fundamental human values makes the topic difficult and interdisciplinary.
The workshop aims to address the following questions.
- Principles
- What principles underlie policy about what data is gathered and how it should be used? How can individual privacy be respected alongside the needs of societal security and commerce?
- In what ways should these questions be framed so that security is not the inevitable winner in matters that pit personal privacy against state security or financial growth?
- What are the limits on the intrusions that nation states should be allowed to make on their citizens?
- Technology
- In what ways can technology contribute to finding ways of reconciling or balancing opposing requirements?
- What are the limits of what can be achieved using technology, and what has to be handled using procedural and legal means?
- Are the cryptographic tools that have been developed to date solving the "right" problems?
- Is there practicable privacy-preserving technology that is not being deployed?
- Can we develop computing technology that better resists being subverted by powerful attackers such as nation states?
- Business
- In what ways can the data interests of commerce be addressed without at the same time allowing companies' unfettered access to the data generated by their customers?
- Law
- What can be done to modernize and harmonize the law so that nations respect the privacy rights of non-nationals and governments cannot circumvent privacy through arrangements with foreign intelligence agencies?
Revelations over the last few years have made clear that the world's intelligence agencies surveil essentially everyone, recording and analyzing who you call, what you do on the web, what you store in the cloud, where you travel, and more. Furthermore, we have learnt that intelligence agencies intentionally subvert security protocols. They tap undersea cables. They install malware on an enormous number of targets worldwide. They use active attacks to undermine our network infrastructure. And they use sophisticated analysis tools to profile individuals and groups.
While we still understand relatively little about who is doing what, the documents leaked by Snowden have led to the conclusion that the Five Eyes organizations are going far beyond anything necessary or proportionate for carrying out legitimate intelligence activities. Governmental assurances of oversight have come to ring hollow, as any oversight to date seems to have been ineffectual, and is perhaps a complete sham.
Can democracy or nonconformity survive if the words and deeds of citizens are to be obsessively observed by governments and their machines? The rise of electronic surveillance thus raises questions of immense significance to modern society. There is an inherent tension. Machine-monitored surveillance of essentially everything people do is now possible. And there are potential economic, political, and safety benefits that power may reap if it can implement effective population-wide surveillance. But there is also a human, social, economic, and political harm that can spring from the very same activity.
The goal of our workshop was to gather together a mix of people with knowledge and expertise in both the legal and technological aspects of privacy and surveillance, to try to understand the landscape that we now live in, and to debate approaches to moving forward. We invited people from a wide range of domains, including members of the intelligence community. All invitees in the intelligence community declined the invitations - in most cases choosing not even to reply. Also, we found that we had more success in getting positive replies from members of the technical community than members of the legal or regulatory communities. Consequently, the makeup of the workshop was not as diverse and balanced as we had hoped. Nonetheless, we felt that we achieved a healthy mix, and there was plenty of lively debate. The issues addressed by this workshop were unusually contentious, and discussions at times were highly animated, even heated.
It is often argued that privacy is not an absolute right. This is true, but this is also true of other rights. The right to freedom must be tempered by the fact that people who are convicted of crimes may forfeit this right for a period. Equally, someone for whom there are sound grounds for suspicion might forfeit some privacy rights. But in any event, any such breaches must be targeted and proportionate and justified by well-founded grounds for suspicion.
An important observation that came up repeatedly in discussions is that privacy is not just an individual right but essential to the health of a democratic society as a whole.
How can society as a whole be given strong assurance that intelligence services are "playing by the rules", while at the same time allowing them sufficient secrecy to fulfill their role? It seems feasible that technical mechanisms can contribute to solving this problem, and indeed a number of presentations addressed aspects of it. One might imagine that something analogous to the notion of zero-knowledge proofs could help demonstrate that intelligence agencies are following appropriate rules without revealing details of their activities. Another possibility that was proposed is to make the amount of surveillance public in a verifiable fashion but without revealing the targets. Thus one might imagine a specified limit being placed on the proportion of traffic available to intelligence services. The effect would be to force the agencies to be correspondingly selective in their choice of targets.
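The commit-then-open pattern underlying the "verifiable amount of surveillance" idea can be sketched in a few lines. This is a toy illustration under stated assumptions - the quota figure, parties, and workflow are hypothetical, and a real design would additionally need zero-knowledge proofs tying the declared count to actual traffic; a bare hash commitment only binds an agency to the figure it later opens:

```python
import hashlib
import os

def commit(count: int, nonce: bytes) -> bytes:
    """Binding, hiding commitment to an integer via SHA-256 over nonce || count."""
    return hashlib.sha256(nonce + count.to_bytes(8, "big")).digest()

def verify(commitment: bytes, count: int, nonce: bytes, limit: int) -> bool:
    """Oversight check: the opening matches the commitment and respects the quota."""
    return commit(count, nonce) == commitment and count <= limit

# A hypothetical agency publishes a commitment to its intercept count for a period...
nonce = os.urandom(16)
published = commit(count=1200, nonce=nonce)

# ...and later opens it to an oversight body, which checks it against an agreed limit.
assert verify(published, 1200, nonce, limit=5000)
assert not verify(published, 900, nonce, limit=5000)  # a different opening is rejected
```

Because the commitment is published before the opening, the agency cannot later claim a different count; the hiding property keeps the figure secret until the oversight review.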
The crypto and security community should invest a substantial effort to make all layers of the internet and our devices more secure and to strengthen the level of privacy offered. This may create a natural barrier to mass surveillance and will also bring a more robust network infrastructure to a society that is increasingly reliant on it for critical services. Such a development may eventually increase the cost for targeted surveillance, but there is no indication that this would become prohibitive.
As is traditional for Dagstuhl, we started with a round table of quick introductions from the participants, including brief statements of what they hoped to get out of the workshop. We then had an open discussion on the goals of the workshop and of how best to organise the workshop to achieve these goals. It was decided to structure discussions into three strands:
- Principles
- Research directions
- Strategy
The outcomes of these discussions are detailed in a separate "Manifesto" document. The workshop was then structured into a number of plenary sessions alternating with breakouts into the three strands. The plenary sessions were made up of presentations from participants and feedback from the breakouts followed by discussion.
The problems addressed in this workshop are immensely challenging, and carry vast implications for society as a whole. It would not be reasonable to expect a small group of people - and a group not particularly representative of society as a whole - to produce solutions in the course of four days. Our goal was to gain some understanding of guiding principles and ways forward.
- Jacob Appelbaum (The Tor Project - Cambridge, US) [dblp]
- Michael Backes (Universität des Saarlandes, DE) [dblp]
- Daniel J. Bernstein (University of Illinois - Chicago, US) [dblp]
- Caspar Bowden
- Jon Callas (Silent Circle - San Jose, US) [dblp]
- Joseph Cannataci (University of Malta, MT & University of Groningen, NL) [dblp]
- George Danezis (University College London, GB) [dblp]
- Pooya Farshim (RHUL - London, GB) [dblp]
- Joan Feigenbaum (Yale University, US) [dblp]
- Ian Goldberg (University of Waterloo, CA) [dblp]
- Christian Grothoff (TU München, DE) [dblp]
- Marit Hansen (ULD SH - Kiel, DE) [dblp]
- Amir Herzberg (Bar-Ilan University - Ramat Gan, IL) [dblp]
- Eleni Kosta (Tilburg University, NL) [dblp]
- Hugo Krawczyk (IBM TJ Watson Research Center - Hawthorne, US) [dblp]
- Susan Landau (Worcester Polytechnic Institute, US) [dblp]
- Tanja Lange (TU Eindhoven, NL) [dblp]
- Kevin S. McCurley (Google - San Jose, US) [dblp]
- David Naccache (ENS - Paris, FR) [dblp]
- Kenneth G. Paterson (Royal Holloway University of London, GB) [dblp]
- Bart Preneel (KU Leuven, BE) [dblp]
- Charles Raab (University of Edinburgh, GB) [dblp]
- Phillip Rogaway (University of California - Davis, US) [dblp]
- Mark D. Ryan (University of Birmingham, GB) [dblp]
- Peter Y. A. Ryan (University of Luxembourg, LU) [dblp]
- Haya Shulman (TU Darmstadt, DE) [dblp]
- Vanessa Teague (The University of Melbourne, AU) [dblp]
- Vincent Toubiana (CNIL - Paris, FR) [dblp]
- Michael Waidner (TU Darmstadt, DE) [dblp]
- Dan Wallach (Rice University - Houston, US) [dblp]
Classification
- data bases / information retrieval
- security / cryptology
- society / human-computer interaction
Keywords
- big data
- cryptography
- mass surveillance
- privacy
- security
- Snowden
- surveillance