There has been a lively discussion over the last few years about strategies for improving research and publishing practices in science. Beyond the research community itself, transparency and reproducibility are now priorities for many major funders in the U.S. and elsewhere, including the NIH, the NSF, and the White House.
Nearly one year ago, a diverse group of about 50 researchers, journal editors, funders, and society leaders gathered at the Center for Open Science (COS) in Charlottesville, VA to continue the discussion about transparency and reproducibility, and leave with concrete, actionable guidelines. Over 2 days, the group crafted what are now known as the Transparency and Openness Promotion (TOP) Guidelines. The TOP Guidelines provide templates of policies and procedures that journals can adopt to encourage greater transparency and reproducibility of the research they publish. With a few tweaks, funders can adopt the TOP guidelines as policies for the research they sponsor. To date, over 500 journals and 50 organizations have become TOP Guidelines signatories.
This week, Wiley joined as the 51st organizational signatory to the Guidelines. Wiley became an organizational signatory because so many of its journals and partner societies are taking an interest in improving transparency in general, and in the TOP Guidelines in particular. Before this announcement, 33 journals published by Wiley had already become TOP signatories (you can view, sort, and search the list by journal title, publisher, society, and subject area at cos.io/top).
Both journal and organizational signatories to TOP are expressing their support for the principles of openness, transparency, and reproducibility, with journal signatories committing to conduct a review for potential adoption within a year.
What do the guidelines say?
The guidelines are domain-agnostic, so they can be adopted across disciplines. They’re also modular, covering 8 different components of the research process:
• Data Citation
• Data Transparency
• Analytic Methods (Code) Transparency
• Research Materials Transparency
• Design and Analysis Transparency
• Preregistration of Studies
• Preregistration of Analysis Plans
• Replication
Each component includes 3 different levels, giving journals the opportunity to adopt some or all of the standards, and to select the level of stringency most appropriate for them. This simultaneously provides flexibility and offers the benefits of shared standards across domains. A table summarizing the guidelines is below - Levels 1-3 are included along with Level 0, which is not part of the guidelines but is included as a baseline for comparison. You can read the guidelines in full detail on the TOP website.
Who’s adopting? And at what levels?
The vast majority of the signatories are still in the process of reviewing the Guidelines. While it’s too early to know what levels of TOP the signatories may adopt, we do plan to gather this information as decisions are made. The TOP Committee hopes to facilitate a community of editors who can share their experiences with implementation and offer advice on best practices.
There are some great examples of the guidelines already in practice - journals that have implemented these practices before the TOP Guidelines came to fruition. These journals can be models for others trying to figure out how to implement the Guidelines. There are more journals with policies like these out there - I’m just highlighting a few:
Data, Analytic Methods (Code), and Materials Transparency
The American Journal of Political Science has author guidelines that already match Level 3 of TOP. AJPS requires authors to submit all data, code, and materials to a trusted third-party repository prior to publication, and the analyses are independently reproduced by analysts at the University of North Carolina’s Odum Institute for Research in Social Science. It’s a really interesting process, and you can read more about it in their blog post.
Preregistration of Studies
A handful of journals encourage the preregistration of research studies by offering their authors Badges to Acknowledge Open Practices, which corresponds to Level 2 of the Guidelines. These journals are Psychological Science, European Journal of Personality, Journal of Social Psychology, and Language Learning.
Level 3 of the Guidelines prescribes the registered report format as a submission format for replication studies. In a registered report, the study methodology and analysis plans are peer reviewed before the research outcomes are known, and acceptance for publication is based on the importance of the research question and the soundness of the study plan. If the plans pass peer review, the report is granted "in-principle acceptance," meaning that as long as the researchers follow the plan (or document and justify why they do not), the study will be published regardless of its outcome. This directly addresses publication bias. Perspectives on Psychological Science is facilitating and publishing Registered Replication Reports - exactly what Level 3 of the guidelines outlines, with a twist: data collection is crowd-sourced. Perspectives publishes registered reports solely in this way, but it's worth noting that nearly 20 journals now publish registered reports for novel work and replications alike.
How can you get involved?
We welcome community involvement! This is just version 1.0 - we expect TOP to evolve. We encourage feedback about what does or doesn’t work in implementation to improve the Guidelines.
If you’re an editor, add your journal to the list of signatories. If you’re an author, ask the editors of journals in which you publish to become signatories. If you’re part of a society or organization that’s in a position to become a signatory - do so! And encourage others to do so, too.
The Transparency and Openness Promotion Committee meeting was organized by COS, the Berkeley Initiative for Transparency in the Social Sciences (BITSS) and Science Magazine, and was funded by the Laura and John Arnold Foundation.
Image Credit: COS | Openness, Integrity, and Reproducibility