Dagstuhl Seminar 22392
Transparent Quantitative Research as a User Interface Problem
(Sep 25 – Sep 30, 2022)
Organizers
- Kasper Hornbæk (University of Copenhagen, DK)
- Yvonne Jansen (CNRS - Talence, FR)
- Amelia A. McNamara (University of St. Thomas - St. Paul, US)
- Judy Robertson (University of Edinburgh, GB)
- Chat Wacharamanotham (Swansea University, GB)
Contact
- Andreas Dolzmann (for scientific matters)
- Jutka Gasiorowski (for administrative matters)
Summary
Many scientific fields face a replication crisis: a sizable portion of quantitative research studies could not be replicated. When these studies were re-run with higher statistical power (i.e., more participants), they yielded effects substantially weaker than, or even opposite to, those in the original studies. This lack of replicability threatens the credibility of research claims and undermines the general public's trust in science. The replication crisis motivated the Open Science movement, which promotes transparency throughout the scientific process: research funding, research design, data collection and analysis, peer review, and knowledge dissemination. These phenomena have attracted the interest of researchers in Human-Computer Interaction (HCI) and Visualization (VIS) for two reasons. First, like their peers in other fields, HCI and VIS researchers face challenges in promoting transparency, implementing and teaching transparent practices effectively, and incorporating transparency into research evaluation processes. Second, HCI and VIS researchers have the methods and skills to study these phenomena empirically and to design potential solutions. The two fields also provide a challenging testbed for such solutions.
This Dagstuhl Seminar initiated and advanced work on these issues by bringing together 23 researchers from HCI, VIS, statistics, psychology, data science, and philosophy. They came from Australia, Austria, Canada, Denmark, Finland, France, Germany, the Netherlands, Sweden, Switzerland, the UK, and the USA. Three participants joined online because of the COVID-19 situation and travel difficulties.
Program
We worked in groups to identify problem areas and prototype potential solutions in a Hackathon. We solicited feedback on these prototypes from conference and journal editors and community leaders. The seminar unfolded as follows:
Day 1: After a brief introduction to the purpose of the seminar and the overall plan, participants worked in small groups to identify problems and challenges to address in the Hackathon. These discussions were intentionally free-form to avoid prematurely narrowing the areas of interest. To stimulate discussion and spark ideas, we gave participants access to free-text responses from a survey on the perception of research transparency that we had collected from HCI researchers in the weeks before the seminar. Four rounds of discussion were interleaved with three-minute presentations of intermediate results in plenary to facilitate convergence and consolidation.
In each plenary round, we also asked a few participants to interview each other in front of the room to acquaint everyone with their backgrounds and research interests. Day 1 concluded with four clusters of topics to work on: (1) educating researchers, (2) clarifying the threats posed by a lack of transparency, (3) clarifying the concept of "transparency", and (4) determining how to influence policies and procedures in the publication process.
Day 2: Participants joined problem clusters according to their interests and started the Hackathon. We provided each group with collaborative workspaces on Google Docs and Miro (an online whiteboard platform). After two Hackathon sessions in the morning, we further stimulated the groups' work with an input lecture by Tim Errington, the Senior Director of Research at the Center for Open Science (see below for an abstract). The lecture highlighted challenges in promoting research transparency and presented a framework for changing research culture at multiple levels: from top-down research funding policy to bottom-up efforts that ease the adoption of transparent practices by providing infrastructure and incentives. After the lecture, the Hackathon continued. We wrapped up the day with a three-minute presentation from each group and a plenary discussion.
Day 3: The Hackathon continued in the morning. We gave the participants prompts that encouraged them to home in on a concrete idea and realize a prototype demonstrating the idea's essence. The afternoon was left free for participants to self-organize group activities that foster trust and informal interaction. We did not organize an excursion because no transportation company was available.
Day 4: The Hackathon continued in the morning. In the afternoon, the participants presented their preliminary results to four panelists who joined online. The panelists held influential positions in the research publication process in HCI and VIS, including SIGCHI President, TOCHI Editor-in-Chief, TVCG Associate Editor-in-Chief, member of the Eurographics Publication Board, CG&A Associate Editor-in-Chief, TVCG Associate Editor, and vice-chair of the IEEE VIS Steering Committee. Each presentation was followed by a discussion of the feedback from policy-making perspectives. The conversation with the panelists broadened participants' views of stakeholders and potential concerns. Afterward, a plenary discussion collectively processed the input from the panel. We identified four areas to address in the manifesto: definition, benefits, subfield suitability, and progressive transparency.
Day 5: Participants worked in groups to draft a manifesto on research transparency. The seminar concluded with a plenary session where we identified possible future projects, their follow-up actions, and coordinators.
Motivation
Many research fields are currently rethinking their research methods and moving towards more transparent practices. Progress towards transparency has been most rapid in fields that were heavily affected by the replication crisis, such as psychology, while change is slower and meets more resistance in interdisciplinary fields such as human-computer interaction (HCI) and visualization (VIS). In this Dagstuhl Seminar, we want to address the hesitant adoption of transparent research methods by framing it as a user interface problem: the 'interface' for using transparent methods is ill-adapted and needs to respond better to the needs and concerns of researchers and other stakeholders in research (e.g., study participants, journal editors, and institutional review boards).
This Dagstuhl Seminar will bring together a diverse group of junior and senior researchers from HCI and VIS, as well as from neighboring fields concerned with transparent quantitative research. Together, they are in a unique position to (1) conduct inquiries into problems and barriers in the human aspects of transparent research and (2) contribute to designing and developing potential solutions that support transparent research practices, both within their fields and beyond. The seminar will be guided by survey data collected beforehand to enable a data-driven approach, and it will be organized around hackathon activities, guest speakers, and comments from policymakers. The seminar's intended outcomes are a synthesis of the collected data, a research agenda, a manifesto of transparent quantitative research practices, and new research collaborations.
We expect these activities to stimulate discussion of questions such as the following: How can we lower the technical barriers that authors face in sharing their research materials and preregistering data analysis plans? How can we encourage teachers of research methods and statistics courses to revise their material to prioritize clear statistical communication and the associated transparent research practices? How can we quantify the "transparency" of statistical practices in a given field? How can policies balance data privacy with the push for more transparency?
Expected outcome
- Define a transdisciplinary research agenda for improving transparent quantitative research, including the development of evaluation methods, empirical studies, and new technologies.
- Produce a manifesto for transparent quantitative research practices, to be promoted by seminar attendees through their networks and shared with journal editors and conference chairs by the organizers to increase the adoption of these practices.
- Develop synergies and networks around this research theme.
Participants
- Lonni Besançon (Linköping University, SE)
- Sophia Crüwell (University of Cambridge, GB)
- Pierre Dragicevic (Inria - Bordeaux, FR)
- Julien Gori (Sorbonne University - Paris, FR)
- Lahari Goswami (University of Lausanne, CH)
- Lynda Hardman (CWI - Amsterdam, NL & Utrecht University, NL)
- Olga Iarygina (University of Copenhagen, DK)
- Yvonne Jansen (CNRS - Talence, FR)
- Eunice Jun (University of Washington - Seattle, US)
- Ulrik Lyngs (University of Oxford, GB)
- Amelia A. McNamara (University of St. Thomas - St. Paul, US)
- Duong Nhu (Monash University - Clayton, AU)
- Viktorija Paneva (Universität Bayreuth, DE)
- Michael Sedlmair (Universität Stuttgart, DE)
- Kavous Salehzadeh Niksirat (University of Lausanne, CH)
- Theophanis Tsandilas (Université Paris-Saclay - Orsay, FR & Inria - Orsay, FR)
- Jan Benjamin Vornhagen (Aalto University, FI)
- Chat Wacharamanotham (Swansea University, GB)
- Erich Weichselgartner (Universität Graz, AT)
- Wesley J. Willett (University of Calgary, CA)
Classification
- Human-Computer Interaction
Keywords
- Open Science
- Transparency
- Reproducibility
- Statistics