Reproducible Research Practices at ISPS

Authored by Limor Peer
April 30, 2018

A recent event at the Yale School of Public Health brought into sharp focus the issue of improving reproducible research practices (see the article summarizing the event). I am grateful for the opportunity to describe our approach at ISPS in this forum. In this brief post, I aim to clarify what we mean by “reproducible research practices” (RRP) at the ISPS Data Archive.

While there is broad agreement that RRP are desirable, there are lingering questions about what exactly they are. We can think of “reproducible research practices” as having two components, and it is important to differentiate them. We might refer to the first as “Scientific RRP”: practices and decisions relating to carrying out the research, from design and planning to execution and analysis. These practices focus on the quality of the research, and in that sense Scientific RRP stand in contrast to “questionable research practices” such as the misuse of statistical methods (e.g., p-hacking; see the sketch below). John Ioannidis puts forth ideas for improving Scientific RRP, including creating a reward system that places more emphasis on investigators conducting rigorous studies, expecting defensible design and conduct standards, conducting more replication studies, fostering a well-trained methodological research workforce, supporting continuing professional development, systematically integrating evidence in systematic reviews, and involving non-conflicted stakeholders.
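To make the p-hacking example concrete, here is a small Monte Carlo sketch. It is purely illustrative and not drawn from Ioannidis or any ISPS material; the choice of 20 outcomes per study, 30 observations per outcome, and a z-test approximation are all assumptions made for the demonstration. It simulates a researcher who measures many unrelated noise outcomes and reports whichever one clears the conventional p < .05 threshold:

```python
"""Monte Carlo sketch of p-hacking via multiple testing (illustrative only)."""

import math
import random
import statistics

random.seed(1)  # fixed seed so the illustration itself is reproducible

ALPHA = 0.05      # conventional significance threshold
N_OUTCOMES = 20   # unrelated noise outcomes "measured" per simulated study
N_OBS = 30        # observations per outcome
N_STUDIES = 2000  # simulated studies


def p_two_sided(sample: list[float]) -> float:
    """Approximate two-sided p-value for H0: mean == 0, via a z-test.

    Using z instead of t slightly understates the p-value at n = 30,
    which is acceptable for a rough illustration.
    """
    z = statistics.mean(sample) / (statistics.stdev(sample) / math.sqrt(len(sample)))
    return math.erfc(abs(z) / math.sqrt(2))


hacked = 0
for _ in range(N_STUDIES):
    # Every outcome is pure noise, so any "significant" result is false.
    p_values = [p_two_sided([random.gauss(0, 1) for _ in range(N_OBS)])
                for _ in range(N_OUTCOMES)]
    if min(p_values) < ALPHA:  # report only the best-looking outcome
        hacked += 1

print(f"Studies with at least one 'significant' noise result: {hacked / N_STUDIES:.0%}")
print(f"Nominal expectation, 1 - (1 - alpha)^k: {1 - (1 - ALPHA) ** N_OUTCOMES:.0%}")
```

With 20 independent chances, roughly two-thirds of the simulated studies turn up a “significant” result from pure noise, which is why practices like pre-registration and reporting all measured outcomes matter.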

Another category of RRP includes practices that relate to reporting how the research was carried out and sharing all the materials produced along the way. We might refer to this as “Transparent RRP,” and these practices are necessary to evaluate the Scientific RRP. Here the focus is on opening up research so that others can see and evaluate what you did, reflecting the norms and values of open science. Improving Transparent RRP means more and better access to raw data, code, protocols, and other materials associated with the research, including, as Ioannidis has argued, documentation of funding and potential conflicts of interest.

Clearly, the two RRP components are linked – scientific rigor can be evaluated only when research materials and processes are transparent – and it is reasonable to expect that researchers’ performance on both will track together. In fact, as technological conditions, policy frameworks, and community expectations evolve and converge, we can expect the emergence of workflows that enable high-quality Scientific and Transparent RRP at the same time. My point here is that, at present, the two components of RRP often require different skills, practices, and tools. In my talk at the YSPH event, I focused on Transparent RRP.

Transparent RRP require specific skills in order to make the entire research lifecycle as open as possible while abiding by any restrictions on the data or methods. This includes knowing how to properly document research data and how to make available research protocols and any scripts and code used to create, clean, combine, or statistically analyze the data (a minimal sketch of such a script follows this paragraph). Especially in disciplines with less centralized infrastructure or common practices, there is also a need for expertise in the legal aspects of data sharing, risk assessment, and preservation strategies. Increasingly, librarians and other information and data professionals are trained in these areas (see Christine Borgman; some have argued that a suite of data science roles is necessary to support scientists and researchers and, therefore, that there is a need for “re-engineering data education, training, and skills production to keep pace with market demands for data talent,” see Liz Lyon). Scientists and researchers may not have, or may not be interested in having, these skills and roles.
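What might a transparently shared analysis script look like? Below is a minimal sketch using only the Python standard library. The file names (data/raw/survey.csv), the income variable, and the manifest format are hypothetical stand-ins invented for this post, not part of the ISPS Data Archive workflow; the point is simply that data cleaning, analysis, and provenance recording live in a single runnable, commented script that a reviewer or secondary user can inspect and re-execute:

```python
"""Minimal sketch of a transparent analysis script (illustrative only).

Assumes a hypothetical raw file, data/raw/survey.csv, with an 'income'
column; every name here is invented for illustration.
"""

import csv
import hashlib
import json
import platform
import statistics
import sys
from datetime import datetime, timezone
from pathlib import Path

RAW = Path("data/raw/survey.csv")      # original data, never modified
CLEAN = Path("data/clean/survey.csv")  # cleaned copy, written separately
MANIFEST = Path("manifest.json")       # provenance record for reviewers


def sha256(path: Path) -> str:
    """Checksum a file so others can verify they have the same bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def main() -> None:
    # Cleaning step: drop rows with missing income; keep raw data intact.
    with RAW.open(newline="") as f:
        rows = [r for r in csv.DictReader(f) if r["income"].strip()]
    if not rows:
        raise SystemExit("no usable rows in raw file")

    CLEAN.parent.mkdir(parents=True, exist_ok=True)
    with CLEAN.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

    # Analysis step: one summary statistic, reported with its N.
    mean_income = statistics.mean(float(r["income"]) for r in rows)
    print(f"N = {len(rows)}, mean income = {mean_income:.2f}")

    # Provenance step: record inputs, outputs, and the environment so
    # someone else can check that these exact files yield this result.
    MANIFEST.write_text(json.dumps({
        "run_at": datetime.now(timezone.utc).isoformat(),
        "python": sys.version,
        "platform": platform.platform(),
        "inputs": {str(RAW): sha256(RAW)},
        "outputs": {str(CLEAN): sha256(CLEAN)},
        "n_rows_clean": len(rows),
    }, indent=2))


if __name__ == "__main__":
    main()
```

Keeping the raw file untouched, writing the cleaned copy elsewhere, and recording checksums alongside the computing environment give a later reader the minimum they need to verify that the shared materials actually produce the reported numbers.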

ISPS produces research that breaks methodological and theoretical boundaries, illuminating new frontiers of knowledge and setting standards for rigor in the evaluation of programs and policies. Many current and former ISPS affiliates have also been recognized for their commitment to transparency and reproducibility in science (for example, Allan Dafoe and Peter Aronow, winners of the Leamer-Rosenthal Prizes for Open Social Science). In 2010 ISPS took a stand to encourage openness in research and launched the ISPS Data Archive as a “means for making the research more open and accessible.”

The ISPS Data Archive works alongside researchers, assisting with and advocating for practices that increase Transparent RRP. ISPS has built a model on the belief that it has both the responsibility and the expertise to assist researchers who wish to disseminate and archive research products that support published scientific claims, and it has developed a process to ensure these products are of high quality. The ISPS process, which aligns data curation with quality review, has been influential and has informed the development of similar practices in other social science data archives, which recently joined together in a consortium called Curating for Reproducibility (CURE). In 2017, CURE received a grant from the Institute of Museum and Library Services (IMLS) Laura Bush 21st Century Librarian Program to define the skills this type of work requires and to develop an evidence-based training program in data curation for reproducibility for librarians and archivists.

Here are some resources for researchers who would like to learn more about Transparent RRP:

How to do it:

Get training:

  • Center for Open Science (COS) Open and Reproducible Practices Workshop
  • Berkeley Initiative for Transparency in the Social Sciences (BITSS) Research Transparency and Reproducibility Trainings (RT2)
  • Inter-university Consortium for Political and Social Research (ICPSR) Summer Program in Quantitative Methods of Social Research
  • Project TIER Teaching Integrity in Empirical Research Protocol
  • NIH Rigor & Reproducibility video modules
  • European Geosciences Union General Assembly (EGUGA) online short course on Writing reproducible geoscience papers using R Markdown, Docker, and GitLab

Teach it:

Get involved:

Journals

  • Data and code sharing policies
  • Transparency and Openness Promotion (TOP) Guidelines
  • The American Journal of Political Science (AJPS) Replication and Verification Policy

Academic societies

  • e.g., APSA Data Access and Research Transparency (DA-RT)

Academic centers

  • Prizes, e.g., BITSS Leamer-Rosenthal Prize

Community enforcement

Repositories

Resources updated May 9, 2018