9 Incentivising Sharing
When sharing data, code, and materials, when reusing resources shared by others, and when appraising research merits, scientists form part of an ecosystem where behaviour is guided by incentives. Scientists can help shape these incentives and promote sharing by making use of mechanisms to assign credit, and by recognising the value of open resources published by others.
9.1 How to get credit for sharing
To make a published dataset citable, use a repository that assigns a persistent identifier, such as a Digital Object Identifier (DOI). Others will then be able to cite the dataset unambiguously.
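Once a dataset has a DOI, it can be cited like any other publication. As an illustrative sketch (the author, title, repository, and DOI below are hypothetical placeholders, not a real dataset), a BibTeX entry might look like:

```bibtex
% Hypothetical example entry: all fields are placeholders.
@misc{doe2024reactiontimes,
  author    = {Doe, Jane},
  title     = {Example reaction-time dataset},
  year      = {2024},
  publisher = {Zenodo},
  doi       = {10.5281/zenodo.0000000},
  note      = {Dataset}
}
```

Including the DOI in the entry lets readers resolve the citation to the exact archived version of the data.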
A further mechanism that can help a researcher get credit for open data is the data article. The purpose of a data article is to describe a dataset in detail, thereby increasing the potential for reuse (Gorgolewski, Margulies, & Milham, 2013). Examples of journals that publish data articles and cover the field of psychology are Scientific Data (https://www.nature.com/sdata/), the Journal of Open Psychology Data (https://openpsychologydata.metajnl.com/), and the Research Data Journal for the Humanities and Social Sciences (http://www.brill.com/products/online-resources/research-data-journal-humanities-and-social-sciences). Data articles can provide documentation going beyond the metadata in a repository, e.g. by including technical validation. They can be a good means of enhancing the visibility and reusability of data and are especially worthwhile for data with high reuse potential.
9.2 Initiatives to increase data sharing
Numerous research funders, universities/institutions, and scientific journals have adopted policies encouraging or mandating open data (reviewed e.g. in Chavan & Penev, 2011 and Houtkoop, Chambers, Macleod, Bishop, Nichols, & Wagenmakers, 2018). The Peer Reviewers’ Openness (PRO) Initiative seeks to encourage transparent reporting of data and materials availability via the peer review process (Morey et al., 2016). Signatories of the PRO Initiative commit to reviewing papers only if the authors either make the data and materials publicly available, or explain in the manuscript why they chose not to share them.
A recent systematic review (Rowhani-Farid, Allen, & Barnett, 2017) found that only one incentive has been tested in health and medical research with data sharing as an outcome measure: Badges to Acknowledge Open Practices (https://osf.io/tvyxz/wiki/home/). Kidwell et al. (2016) observed an almost 10-fold increase in data sharing after badges were introduced at the journal Psychological Science. However, because this was an observational study, it is possible that other factors contributed to this trend. A follow-up study of badges at the journal Biostatistics found a more modest increase of about seven percentage points (Rowhani-Farid & Barnett, 2018).
Another strategy for incentivising sharing comes from fellowships that fund the expansion of transparent research practices in academic institutions, such as the rOpenSci fellowship program and the Mozilla Science Fellowship program.
9.3 Reusing others’ research products
Citation of research products – software, data, and materials, not just papers – contributes to better incentives for sharing these products. Commonly cited barriers to data sharing include fears among data-generating researchers that others will publish important findings based on the data before they do (“scooping”), that duplication of effort will lead to inefficient use of resources, and that new analyses will produce unpredictable and contradictory results (International Consortium of Investigators for Fairness in Trial Data Sharing et al., 2016; Smith & Roberts, 2016). To our knowledge, there is no reported case of a scientist who has been scooped with their own data after publishing them openly, and differences in results can be the topic of a fruitful scientific discourse; nevertheless, such fears can be allayed by consulting the researchers who published the data before conducting the (re)analysis. A further reason for consulting the researchers who created the data, code, or materials is that they know the resource well and may be able to anticipate pitfalls in reuse strategies and propose improvements (Lo & DeMets, 2016). While the publication of a resource such as data, code, or materials generally does not in itself merit consideration for co-authorship on subsequent independent reports, it may be valuable to invite the resource originators into a discussion about the proposed new work. If the resource originators make an important academic contribution in this discussion, it is reasonable to consider offering co-authorship. What constitutes an important contribution can only be determined case by case; development of hypotheses, analytical strategies, and interpretations of results are examples that may fall in this category. Approaching open resources with an openness towards collaboration may thus help to increase their value, as well as promoting a sharing culture.
Bear in mind that offering co-authorship to researchers whose only contribution was to share their previously collected data with you on request disincentivises public sharing.
Contact the originators of the resource beforehand and tell them about your plans.
Consider including the original authors in a collaboration if they make an important academic contribution to your efforts.
Do not include researchers as co-authors if their only contribution was sharing previously collected data on request.
Cite both the resource and any associated papers.
Address open science practices explicitly in assessments of merit, and appraise the value of open science contributions.
9.4 When appraising scientific merits
The merit value attached to open science practices may strongly influence scientists’ behaviour. Therefore, when appraising merits, for example in decisions about hiring and awarding research grants, open science practices should be assessed. Some departments of psychology[6] encourage or even require job advertisements to include a statement expressing commitment to open science[7]. Examples of such job advertisements are available here: https://osf.io/b9zks/.
Several factors affect the value of open scientific resources.
Indicators of higher value may include:
The resource is available in a trusted repository
The resource is comprehensively described
Data and metadata conform to discipline-specific standards
The resource appears, on the face of it, to have high reuse potential
There is evidence of reuse
[6] Such as the departments at Ludwig Maximilian University (Munich, Germany) and at the University of Oregon.
[7] An example of such a statement: “Our department embraces the values of open science and strives for replicable and reproducible research. For this goal we support transparent research with open data, open materials, and study pre-registration. Candidates are asked to describe in what way they already pursued and plan to pursue these goals.” (http://www.fak11.lmu.de/dep_psychologie/osc/open-science-hiring-policy/index.html)