Files in this item

WILLIS-DISSERTATION-2020.pdf (application/pdf, 2 MB)
Title: Trust, but verify: An investigation of methods of verification and dissemination of computational research artifacts for transparency and reproducibility
Author(s): Willis, Craig Alan
Director of Research: Stodden, Victoria
Doctoral Committee Chair(s): Stodden, Victoria
Doctoral Committee Member(s): Darch, Peter; Ludaescher, Bertram; Taufer, Michela
Department / Program: Information Sciences
Discipline: Library & Information Science
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Computational reproducibility; Peer review
Abstract: In this study, I investigate how research communities are addressing concerns about the quality and rigor of computational research. I focus specifically on initiatives to expand the peer review and publication process to include new requirements for the assessment and dissemination of computational research artifacts. I report the results of a multiple-case analysis of two primary (American Journal of Political Science, ACM/IEEE Supercomputing) and five supplemental cases in political science, computer science, economics, mathematics, and statistics. Cases were developed through qualitative analysis based on interviews with key stakeholders, including editors, reviewers, and verifiers; a sample of verified artifacts; and documentary evidence including policies, guidelines, and workflows. The central argument of this dissertation is that these reproducibility initiatives represent a set of experiments across the sciences exploring how changes to the incentives and information requirements of authors impact the quality, rigor, reproducibility, and trustworthiness of published research. These initiatives are part of a broader effort to change community norms with respect to the dissemination of the results of research that involves computation, elevating the importance of computational artifacts and clearly signaling that authors cannot be trusted to provide this information voluntarily. Expanding peer review and increasing the information required of authors through publication reproducibility audits is just one approach -- and a costly one -- to improving research quality and trustworthiness. The effect of these changes on research quality has yet to be demonstrated or studied.
Based on the cases, I identify key factors that influence the operationalization of policies and workflows; the elements that each community considers important to the assessment of computational transparency and reproducibility; and the tools and infrastructure that they leverage to aid in the creation, assessment, and dissemination of reproducible research artifacts. I develop a framework to analyze the reproducibility initiatives and a conceptual model of reproducible research artifacts. I relate my findings to recommendations from the recent National Academies of Sciences, Engineering, and Medicine (NASEM) report "Reproducibility and Replicability in Science" and provide a set of normative guidelines for communities interested in pursuing similar initiatives, with implications for journal and conference leadership; tool and infrastructure developers; and funding bodies. I conclude that, while promising, further efforts should be made to increase our understanding of the effect of initiative policies and technological advancements on research quality.
Issue Date: 2020-07-13
Rights Information: Copyright 2020 Craig Willis
Date Available in IDEALS: 2020-10-07
Date Deposited: 2020-08
