A Brief Review of Open Science Practices

Summary: Are you interested in learning more about open science practices? Here is a brief review to get you started.

By van Dijk & Hart, 2020, available under a CC BY 4.0 license.

Open Data

In a nutshell, Open Data means making all the raw data of a project publicly available. This stands in contrast to presenting only summary statistics (means, standard deviations, correlations) in published manuscripts.

This includes data at the individual participant level that have been cleaned and deidentified, as well as metadata: information about the dataset in a codebook and more general information about the project as a whole.
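
For illustration only, a codebook can be as simple as a small machine-readable table that describes each variable in the shared dataset. The sketch below uses Python with the pandas library; the variable names, labels, and value ranges are hypothetical and would be replaced by those of the actual project.

# make_codebook.py -- hypothetical sketch of a minimal machine-readable codebook
import pandas as pd

# Each row documents one variable in the shared, deidentified dataset.
codebook = pd.DataFrame([
    {"variable": "participant_id", "label": "Deidentified participant code", "values": "string"},
    {"variable": "condition", "label": "Study condition", "values": "0 = control, 1 = intervention"},
    {"variable": "pretest_score", "label": "Reading comprehension score at pretest", "values": "0-40"},
    {"variable": "posttest_score", "label": "Reading comprehension score at posttest", "values": "0-40"},
])

# Save the codebook alongside the data so other researchers can interpret each variable.
codebook.to_csv("codebook.csv", index=False)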

Many grant funding agencies require data collected by their grantees to be publicly available.

Researchers can choose to share only those variables used in the analyses or share all other variables as well (with the exception of personal identifiers).

Openly available datasets are considered permanent products of a research project. They carry a DOI and are citable, giving researchers the opportunity to show their impact beyond traditional publications. Publications that are accompanied by open data are cited more often than publications without open data attached (Piwowar & Vision, 2013).

To make replication of the research possible, researchers should also share any code used to analyze the data. This code should show how subsets of the total dataset were created and how the analyses were run. Beyond the syntax itself, researchers should annotate the code to explain why certain decisions were made.
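
As an illustration, a shareable analysis script might look like the following minimal sketch. It is written in Python with the pandas and scipy libraries; the file name, variable names, exclusion rule, and analysis are hypothetical and stand in for whatever the actual project used.

# analysis.py -- hypothetical sketch of an annotated, shareable analysis script
import pandas as pd
from scipy import stats

# Load the cleaned, deidentified, participant-level dataset (hypothetical file name).
data = pd.read_csv("deidentified_data.csv")

# Document how the analytic subset was created: keep only participants with
# both a pretest and a posttest score, because the planned analysis is a
# paired comparison.
analytic_sample = data.dropna(subset=["pretest_score", "posttest_score"])

# Planned (confirmatory) analysis: paired-samples t test of pretest vs. posttest.
t_stat, p_value = stats.ttest_rel(
    analytic_sample["pretest_score"],
    analytic_sample["posttest_score"],
)

print(f"Paired t test: t = {t_stat:.2f}, p = {p_value:.3f}, n = {len(analytic_sample)}")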

For more information on data curation see: 

Open Documentation

Open Documentation refers to the practice of making available to other researchers anything necessary to carry out the project that is not data or code.

Sharing materials openly can help other researchers replicate a project, either directly or with carefully controlled differences. Replication of research helps to determine its robustness and generalizability.

In many cases, open materials will include study protocols, decision flowcharts, intervention materials, stimuli, blank consent forms, or assessment materials. Any materials that are commercially available should not be shared.

Often, project materials are already stored digitally, so only limited resources need to be expended on sharing them.

Preregistration/Registered Reports

Preregistration and registered reports go beyond sharing data, code, and materials of a project that has been conducted. In the case of preregistration and registered reports, researchers share the reasoning, design, data collection procedures, and data analytic plan of their study before starting a project (van ‘t Veer & Giner-Sorolla, 2016).

Preregistrations are submitted to a registry and can sometimes receive feedback before the plans are carried out. After submission, researchers are able to make changes to their original plans during the design and data collection phases if unforeseen challenges arise. Once data are collected, however, changes to the original data analytic plan should be noted in the report. This signals to consumers which analyses were confirmatory (planned) and which were exploratory (unplanned) (Gehlbach & Robinson, 2018).

Registered reports differ slightly from preregistrations. First, authors situate their project within the literature by adding an introduction. Second, the plan is then submitted to a journal and undergoes peer review. Journals may then decide to give 'in principle acceptance'. In other words, if researchers carry out the project as they have outlined, the journal will publish the results regardless of the outcomes (Nosek et al., 2019).

Example registries are Cochrane (https://us.cochrane.org) and PROSPERO (https://www.crd.york.ac.uk/prospero) for systematic reviews and meta-analyses, and the Society for Research on Educational Effectiveness registry (https://sreereg.icpsr.umich.edu), the Open Science Framework (https://osf.io), and AsPredicted (www.aspredicted.org) for general studies. These sites have protocols for preregistration of different types of studies.

Open Access

Publishing Open Access (OA) helps to make the results of research available to everyone, not just people or institutions with subscriptions to specific journals (Klein et al., 2018; Norris et al., 2008). Open Access articles generally receive more citations.

There are several ways to publish papers Open Access. First, there are dedicated Open Access journals. Publishing in these journals is often referred to as Gold OA, and authors pay an article-processing charge. In many cases, journals are not completely OA, but authors can choose to pay the charge to make their article OA; this is often referred to as hybrid OA.

Additionally, authors can post preprints and postprints to repositories and archives; this practice is referred to as Green OA. Preprints are manuscripts that are openly shared before undergoing peer review at a journal. Postprints are versions of the manuscript posted after peer review that have not been formatted according to the journal's preferences. Finally, many grant funding agencies mandate that papers be made available through their repositories (e.g., ERIC, PubMed Central, or NSF-PAR).

Journals have different policies regarding Open Access publishing, and it is recommended that authors check these policies before posting their work. Information on journals' archiving policies and access options can be found on the SHERPA/ROMEO website (http://www.sherpa.ac.uk/romeo/search.php).

Open Science Badges

Some journals award badges to authors of manuscripts adhering to one or more of three Open Science principles: Open Data, Open Materials, and Preregistration (https://osf.io/tvyxz/wiki/1.%20View%20the%20Badges/).

To qualify for the Open Data badge, authors of a paper need to specify that the data are publicly available in a repository, with a unique, persistent identifier and a timestamp. The data should be accompanied by a codebook and carry a license that allows others to copy, distribute, and use the data while keeping credit and copyright with the authors.

For the Open Materials badge, authors should specify that all materials needed to reproduce the results are openly available. Just as with Open Data, the materials need an identifier, timestamp, and license. Authors should also make clear how each of the materials is related to the study and its analyses.

Finally, authors can be awarded a badge for preregistering their study. The registration should be submitted to an institutional registration system and timestamped before the intervention is delivered or the study is conducted. The design in the manuscript should correspond to the registration, or detail how and why it differed, and the manuscript should report the results of all hypotheses posed in the registration.

The Center for Open Science keeps a list of journals awarding badges (https://www.cos.io/our-services/badges).

Recommended Reading and References

Cook, B. G. (2016). Reforms in academic publishing: Should behavioral disorders and special education journals embrace them? Behavioral Disorders, 41(3), 161–172. https://doi.org/10.17988/0198-7429-41.3.161

Coyne, M. D., Cook, B. G., & Therrien, W. J. (2016). Recommendations for replication research in special education: A framework of systematic, conceptual replications. Remedial and Special Education, 37(4), 244–253. https://doi.org/10.1177/0741932516648463

Gehlbach, H., & Robinson, C. D. (2018). Mitigating illusory results through preregistration in education. Journal of Research on Educational Effectiveness, 11(2), 296–315. https://doi.org/10.1080/19345747.2017.1387950

Grahe, J. (2018). Another step towards scientific transparency: Requiring research materials for publication. The Journal of Social Psychology, 158(1), 1–6. https://doi.org/10.1080/00224545.2018.1416272

Harnad, S., Brody, T., Vallières, F., Carr, L., Hitchcock, S., Gingras, Y., Oppenheim, C., Stamerjohanns, H., & Hilf, E. R. (2004). The access/impact problem and the green and gold roads to open access. Serials Review, 30(4), 310–314.

Klein, O., Hardwicke, T. E., Aust, F., Breuer, J., Danielsson, H., Mohr, A. H., Ijzerman, H., Nilsonne, G., Vanpaemel, W., & Frank, M. C. (2018). A practical guide for transparency in psychological science. Collabra: Psychology, 4(1), 20. https://doi.org/10.1525/collabra.158

Norris, M., Oppenheim, C., & Rowland, F. (2008). The citation advantage of open-access articles. Journal of the American Society for Information Science and Technology, 59(12), 1963–1972. https://doi.org/10.1002/asi.20898

Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., van ’t Veer, A. E., & Vazire, S. (2019). Preregistration is hard, and worthwhile [Preprint]. PsyArXiv. https://doi.org/10.31234/osf.io/wu3vs

Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631.

Van den Akker, O., Weston, S. J., Campbell, L., Chopik, W. J., Damian, R. I., Davis-Kean, P., Hall, A. N., Kosie, J. E., Kruse, E. T., Olsen, J., Ritchie, S. J., Valentine, K. D., van ’t Veer, A. E., & Bakker, M. (2019). Preregistration of secondary data analysis: A template and tutorial [Preprint]. PsyArXiv. https://doi.org/10.31234/osf.io/hvfmr

van der Zee, T., & Reich, J. (2018). Open education science. AERA Open, 4(3), 1–15. https://doi.org/10.1177/2332858418787466

van Dijk, W., Schatschneider, C., & Hart, S. A. (2020). Open Science in Education Sciences. Journal of Learning Disabilities, advance online publication. https://doi.org/10.1177/0022219420945267

van ’t Veer, A. E., & Giner-Sorolla, R. (2016). Pre-registration in social psychology—A discussion and suggested template. Journal of Experimental Social Psychology, 67, 2–12. https://doi.org/10.1016/j.jesp.2016.03.004

Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3(1), 160018. https://doi.org/10.1038/sdata.2016.18

https://www.dtls.nl/fair-data/fair-data/