10 References

Allaire, J. J., Cheng, J., Xie, Y., McPherson, J., Chang, W., Allen, J., … Arslan, R. C. (2017a). rmarkdown: Dynamic Documents for R [Computer software]. Retrieved from https://cran.r-project.org/web/packages/rmarkdown

Allaire, J. J., R Foundation, Wickham, H., Journal of Statistical Software, Xie, Y., Vaidyanathan, R., … Yu, M. (2017b). rticles: Article Formats for R Markdown [Computer software]. Retrieved from https://cran.r-project.org/web/packages/rticles

American Psychological Association. (2010). Publication Manual of the American Psychological Association (6th edition). Washington, DC: American Psychological Association.

Arslan, R. C. (2018). Automatic codebook generation with codebook [Computer software]. Retrieved from https://github.com/rubenarslan/codebook

Aust, F. (2016). citr: ‘RStudio’ Add-in to Insert Markdown Citations [Computer software]. Retrieved from https://cran.r-project.org/web/packages/citr

Aust, F., & Barth, M. (2017). papaja: Create APA manuscripts with R Markdown [Computer software]. Retrieved from https://github.com/crsh/papaja

Boettiger, C. (2015). An introduction to Docker for reproducible research. ACM SIGOPS Operating Systems Review, 49(1), 71–79. https://doi.org/10.1145/2723872.2723882

Chambers, C. D. (2013). Registered Reports: A new publishing initiative at Cortex. Cortex, 49(3), 609–610. https://doi.org/10.1016/j.cortex.2012.12.016

Chambers, C. D., Feredoes, E., Muthukumaraswamy, S. D., & Etchells, P. J. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1(1), 4–17. https://doi.org/10.3934/Neuroscience.2014.1.4

Chassang, G. (2017). The impact of the EU general data protection regulation on scientific research. Ecancermedicalscience, 11, 709. https://doi.org/10.3332/ecancer.2017.709

Chavan, V., & Penev, L. (2011). The data paper: A mechanism to incentivize data publishing in biodiversity science. BMC Bioinformatics, 12(Suppl 15), S2. https://doi.org/10.1186/1471-2105-12-S15-S2

Chirigati, F., Rampin, R., Shasha, D., & Freire, J. (2016). ReproZip: Computational reproducibility with ease. In Proceedings of the 2016 International Conference on Management of Data (pp. 2085–2088). ACM.

Donoho, D. L. (2010). An invitation to reproducible computational research. Biostatistics, 11(3), 385–388. https://doi.org/10.1093/biostatistics/kxq028

Eglen, S. J., Marwick, B., Halchenko, Y. O., Hanke, M., Sufi, S., Gleeson, P., … Poline, J.-B. (2017). Toward standard practices for sharing computer code and programs in neuroscience. Nature Neuroscience, 20(6), 770–773. https://doi.org/10.1038/nn.4550

El Emam, K. (2013). Guide to the de-identification of personal health information. Boca Raton, FL: CRC Press.

Eubank, N. (2016). Lessons from a Decade of Replications at the Quarterly Journal of Political Science. PS: Political Science & Politics, 49(2), 273–276. https://doi.org/10.1017/S1049096516000196

Gandrud, C. (2013a). Reproducible Research with R and RStudio. Boca Raton, FL: CRC Press. https://github.com/christophergandrud/Rep-Res-Book

Gandrud, C. (2013b). GitHub: A tool for social data set development and verification in the cloud. Available at SSRN: https://ssrn.com/abstract=2199367 or https://doi.org/10.2139/ssrn.2199367

Glatard, T., Lewis, L. B., Ferreira da Silva, R., Adalat, R., Beck, N., Lepage, C., … Evans, A. C. (2015). Reproducibility of neuroimaging analyses across operating systems. Frontiers in Neuroinformatics, 9. https://doi.org/10.3389/fninf.2015.00012

Gorgolewski, K. J., Margulies, D. S., & Milham, M. P. (2013). Making data sharing count: a publication-based solution. Frontiers in Neuroscience, 7, 9.

Gronenschild, E. H. B. M., Habets, P., Jacobs, H. I. L., Mengelers, R., Rozendaal, N., Os, J. van, & Marcelis, M. (2012). The Effects of FreeSurfer Version, Workstation Type, and Macintosh Operating System Version on Anatomical Volume and Cortical Thickness Measurements. PLOS ONE, 7(6), e38234. https://doi.org/10.1371/journal.pone.0038234

Hardwicke, T. E., Mathur, M. B., MacDonald, K. E., Nilsonne, G., Banks, G. C., … Frank, M. C. (2018, March 19). Data availability, reusability, and analytic reproducibility: Evaluating the impact of a mandatory open data policy at the journal Cognition. Retrieved from https://osf.io/preprints/bitss/39cfb/

Houtkoop, B., Chambers, C., Macleod, M., Bishop, D., Nichols, T., & Wagenmakers, E.-J. (2018). Data sharing in psychology: A survey on barriers and preconditions. Advances in Methods and Practices in Psychological Science. https://doi.org/10.1177/2515245917751886

Huff, K. (2017). Lessons Learned. In Kitzes, J., Turek, D., & Deniz, F. (Eds.). The Practice of Reproducible Research: Case Studies and Lessons from the Data-Intensive Sciences. Oakland, CA: University of California Press. Retrieved from https://www.gitbook.com/book/bids/the-practice-of-reproducible-research

Ince, D. C., Hatton, L., & Graham-Cumming, J. (2012). The case for open computer programs. Nature, 482(7386), 485–488. https://doi.org/10.1038/nature10836

International Consortium of Investigators for Fairness in Trial Data Sharing, Devereaux, P. J., Guyatt, G., Gerstein, H., Connolly, S., & Yusuf, S. (2016). Toward Fairness in Data Sharing. The New England Journal of Medicine, 375(5), 405–407. https://doi.org/10.1056/NEJMp1605654

Keeling, K. B., & Pavur, R. J. (2007). A comparative study of the reliability of nine statistical software packages. Computational Statistics & Data Analysis, 51(8), 3811–3831. https://doi.org/10.1016/j.csda.2006.02.013

Kernighan, B. W., & Plauger, P. J. (1978). The Elements of Programming Style (2nd edition). New York: McGraw-Hill.

Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L.-S., … Nosek, B. A. (2016). Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency. PLoS Biology, 14(5), e1002456. https://doi.org/10.1371/journal.pbio.1002456

Kitzes, J. (2017). The Basic Reproducible Workflow Template. In Kitzes, J., Turek, D., & Deniz, F. (Eds.). The Practice of Reproducible Research: Case Studies and Lessons from the Data-Intensive Sciences. Oakland, CA: University of California Press. Retrieved from https://www.gitbook.com/book/bids/the-practice-of-reproducible-research

Kluyver, T., Ragan-Kelley, B., Pérez, F., Granger, B., Bussonnier, M., Frederic, J., … Jupyter Development Team. (2016). Jupyter Notebooks – A publishing format for reproducible computational workflows. In Proceedings of the 20th International Conference on Electronic Publishing (pp. 87–90). https://doi.org/10.3233/978-1-61499-649-1-87

Knuth, D. E. (1984). Literate Programming. The Computer Journal, 27(2), 97–111. https://doi.org/10.1093/comjnl/27.2.97

Leeper, T. J. (2014). Archiving Reproducible Research with R and Dataverse. R Journal, 6(1).

Lin, W., & Green, D. P. (2016). Standard Operating Procedures: A Safety Net for Pre-Analysis Plans. PS: Political Science & Politics, 49(3), 495–500. https://doi.org/10.1017/S1049096516000810

Lo, B., & DeMets, D. L. (2016). Incentives for Clinical Trialists to Share Data. The New England Journal of Medicine, 375(12), 1112–1115. https://doi.org/10.1056/NEJMp1608351

Long, J. S. (2009). The workflow of data analysis using Stata. College Station, TX: Stata Press.

Martin, R. C. (2009). Clean Code: A Handbook of Agile Software Craftsmanship. Upper Saddle River, NJ: Prentice Hall.

Meyer, M. N. (2018). Practical tips for ethical data sharing. Advances in Methods and Practices in Psychological Science. Advance online publication. https://doi.org/10.1177/2515245917747656

Morey, R. D., Chambers, C. D., Etchells, P. J., Harris, C. R., Hoekstra, R., Lakens, D., et al. (2016). The peer reviewers’ openness initiative: Incentivizing open research practices through peer review. Royal Society Open Science, 3(1), 150547. https://doi.org/10.1098/rsos.150547

Morin, A., Urban, J., Adams, P. D., Foster, I., Sali, A., Baker, D., & Sliz, P. (2012). Shining Light into Black Boxes. Science, 336(6078), 159–160. https://doi.org/10.1126/science.1218263

Piccolo, S. R., & Frampton, M. B. (2016). Tools and techniques for computational reproducibility. GigaScience, 5, 30. https://doi.org/10.1186/s13742-016-0135-4

Rouder, J. N. (2016). The what, why, and how of born-open data. Behavior Research Methods, 48(3), 1062–1069. https://doi.org/10.3758/s13428-015-0630-z

Rowhani-Farid, A., Allen, M., & Barnett, A. G. (2017). What incentives increase data sharing in health and medical research? A systematic review. Research Integrity and Peer Review, 2(1), 4. https://doi.org/10.1186/s41073-017-0028-9

Rowhani-Farid, A., & Barnett, A. G. (2018). Badges for sharing data and code at Biostatistics: An observational study. F1000Research, 7, 90. https://doi.org/10.12688/f1000research.13477.1

Sandve, G. K., Nekrutenko, A., Taylor, J., & Hovig, E. (2013). Ten Simple Rules for Reproducible Computational Research. PLOS Computational Biology, 9(10), e1003285. https://doi.org/10.1371/journal.pcbi.1003285

Simonsohn, U. (2015). Small telescopes: Detectability and the evaluation of replication results. Psychological Science, 26(5), 559–569. https://doi.org/10.1177/0956797614567341

Smith, R., & Roberts, I. (2016). Time for sharing data to become routine: The seven excuses for not doing so are all invalid. F1000Research, 5, 781. https://doi.org/10.12688/f1000research.8422.1

Staneva, V. (2017). Assessing Reproducibility. In Kitzes, J., Turek, D., & Deniz, F. (Eds.). The Practice of Reproducible Research: Case Studies and Lessons from the Data-Intensive Sciences. Oakland, CA: University of California Press. Retrieved from https://www.gitbook.com/book/bids/the-practice-of-reproducible-research

Steegen, S., Tuerlinckx, F., Gelman, A., & Vanpaemel, W. (2016). Increasing transparency through a multiverse analysis. Perspectives on Psychological Science, 11, 702–712. https://doi.org/10.1177/1745691616658637

Stodden, V., & Miguez, S. (2014). Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research. Journal of Open Research Software, 2(1). https://doi.org/10.5334/jors.ay

Sweeney, L. (2000). Simple demographics often identify people uniquely (Data Privacy Working Paper 3). Pittsburgh, PA: Carnegie Mellon University. Retrieved from http://impcenter.org/wp-content/uploads/2013/09/Simple-Demographics-Often-Identify-People-Uniquely.pdf

Thabane, L., Mbuagbaw, L., Zhang, S., Samaan, Z., Marcucci, M., Ye, C., et al. (2013). A tutorial on sensitivity analyses in clinical trials: the what, why, when and how. BMC Medical Research Methodology, 13(1), 92. https://doi.org/10.1186/1471-2288-13-92

van ’t Veer, A. E., & Giner-Sorolla, R. (2016). Pre-registration in social psychology—A discussion and suggested template. Journal of Experimental Social Psychology, 67(C), 2–12. https://doi.org/10.1016/j.jesp.2016.03.004

Vihinen, M. (2015). No more hidden solutions in bioinformatics. Nature News, 521(7552), 261. https://doi.org/10.1038/521261a

Welty, L. J., Rasmussen, L. V., Baldridge, A. S., & Whitley, E. (2016). StatTag [Computer software]. Chicago, IL: Galter Health Sciences Library. https://doi.org/10.18131/G36K76

Xie, Y. (2015). Dynamic Documents with R and knitr (2nd edition). Boca Raton, FL: CRC Press.


  1. The terms “anonymize” and “de-identify” are used differently in various privacy laws, but typically refer to the process of minimizing risk of re-identification using current best statistical practices (El Emam, 2013).

  2. It should therefore be distinguished from a practice sometimes called “pseudo-anonymization”, which involves only partial anonymization (e.g., “Michael Johnson” becoming “Michael J.”).

  3. Alternatively, the same workflow could also be set up with other cloud storage providers that are integrated with OSF, such as ownCloud, Google Drive, or Box, though funders or institutions may have restrictions on the use of these providers.

  4. Codebooks can be generated from the data set metadata in popular statistical software, including SPSS (http://libguides.library.kent.edu/SPSS/Codebooks), Stata (http://www.stata.com/manuals13/dcodebook.pdf), or R (http://www.martin-elff.net/knitr/memisc/codebook.html; https://cran.r-project.org/web/packages/codebook/index.html), or with data publishing tools (e.g., http://www.nesstar.com/software/publisher.html). The author of the codebook package for R (Arslan, 2018) has also created a web app for creating codebooks from SPSS, Stata, or RDS files (https://rubenarslan.ocpu.io/codebook/www/); a minimal usage sketch follows below.
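In R, generating such a codebook can be a single function call. The following is a minimal sketch, not a canonical workflow prescribed by the package: it assumes the codebook and rio packages are installed, and the file name mydata.sav is a hypothetical placeholder.

```r
# Minimal sketch: generate a codebook with the codebook package (Arslan, 2018).
# Assumes a labelled SPSS file "mydata.sav" (hypothetical); any data frame
# carrying variable metadata works. codebook() is designed to be called inside
# an R Markdown chunk so that its output renders as a formatted codebook.
library(codebook)

codebook_data <- rio::import("mydata.sav")  # hypothetical file name

# Summarize each variable from its attached metadata
# (variable labels, value labels, missing-value codes).
codebook(codebook_data)
```

Placed in an R Markdown document and knitted, this chunk produces an HTML, PDF, or Word codebook that can be shared alongside the data set.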

  5. For those interested in metadata and codebooks, the Digital Curation Centre provides a helpful overview (see http://www.dcc.ac.uk/resources/metadata-standards). Common metadata standards are the basic and general-purpose Dublin Core and the more social-science-focused Data Documentation Initiative (DDI) that was originally developed for survey data (Leeper, 2014).

  6. such as those at Ludwig Maximilian University (Munich, Germany) and at the University of Oregon.

  7. An example of such a statement is: “Our department embraces the values of open science and strives for replicable and reproducible research. For this goal we support transparent research with open data, open materials, and study pre-registration. Candidates are asked to describe in what way they already pursued and plan to pursue these goals.” (http://www.fak11.lmu.de/dep_psychologie/osc/open-science-hiring-policy/index.html).


The contents of this website are licensed under . When referring to this website, please cite:
Klein, O., Hardwicke, T. E., Aust, F., Breuer, J., Danielsson, H., Hofelich Mohr, A., … Frank, M. C. (2018). A Practical Guide for Transparency in Psychological Science. Collabra: Psychology, 4(1), 20. https://doi.org/10.1525/collabra.158