8 Preregistration

How should you pre-register your study? Awareness of pre-registration has grown in recent years, but there are still few established guidelines to follow. In brief, an ideal pre-registration involves a written specification of your hypotheses, methods, and analyses that you formally ‘register’ (create a time-stamped, read-only copy of) on a public website, such that it can be viewed by the scientific community. Another form of pre-registration, known as “Registered Reports” (Chambers, 2013; Hardwicke & Ioannidis, 2018), involves submitting your pre-registration to a journal, where it undergoes peer review. You may then be offered ‘in-principle acceptance’ before you have even started the study, indicating that the article will be published pending successful completion of the study according to the outlined methods and analytic procedures, as well as a cogent interpretation of the results. This unique feature of Registered Reports may offer some remedy for publication bias because studies are accepted for publication based on the merits of the research question and the methodological quality of the design, rather than the outcomes (Chambers et al., 2014).

Really, it is up to you how much detail you put in your pre-registration and where you store it. But clearly, a more detailed (and reviewed) pre-registration will provide more constraint over the potential analytical flexibility, or ‘researcher degrees of freedom’, outlined above, and will therefore allow you and others to gain more confidence in the veracity of your findings. To get started, you may wish to use an established pre-registration template. The Open Science Framework (OSF) has several to choose from (for a brief tutorial on how to pre-register via the OSF, see https://osf.io/2vu7m/). In an OSF project, click on the “Registrations” tab and then “New Registration”. You will see a list of options. For example, there is a template that has been developed specifically for social psychology (van ’t Veer & Giner-Sorolla, 2016). For a simpler, more general template, you may wish to try the “AsPredicted preregistration”. This template asks you nine key questions about your study, for example, “Describe the key dependent variable(s) specifying how they will be measured.”

One downside of templates is that they do not always cover important aspects of your study that you think should be pre-registered but the template creators have not anticipated. Templates can also be limited if you want to specify detailed analysis code within your pre-registration document. As a result, you may quickly find that you prefer to create your own custom pre-registration document (either from scratch or adapted from a template). Such a document can still be registered on the OSF: you just need to upload it to your OSF project as a regular file and register it using the procedure outlined above, this time choosing the “OSF-Standard Pre-Data Collection Registration” option instead of one of the other templates.

After completing a template, or choosing to register a custom document, you will be asked if you would like to make the pre-registration public immediately, or set an embargo period of up to four years, after which the pre-registration will be made public. Note that the AsPredicted template mentioned above is actually based on a different website (https://aspredicted.org/) that provides its own registration service as an alternative to the OSF. If you use the AsPredicted service, all pre-registrations are private by default until they are explicitly made public by their owners. This may sound appealing, but it is potentially problematic: when registrations are private, the scientific community cannot monitor whether studies are being registered but not published (e.g., a file-drawer effect), or whether multiple, similar pre-registrations have been created. We would therefore recommend using the OSF, where all pre-registrations are made public after at most four years.

Once the registration process is complete (you and your collaborators may first need to respond to a confirmation e-mail), you will be able to see the frozen, read-only, time-stamped version of your project containing your pre-registration. You may need to click on the green “view registration” button if you used a template, or click on your custom pre-registration document in the “files” window to see the content itself. The URL displayed in the address bar is a unique, persistent link to your pre-registration that you can include in your final published article.

When you write up your study, you should explicitly indicate which aspects were pre-registered and which were not. It is likely that some deviations from your plan were necessary. This is not problematic: simply note them explicitly and clearly, providing a rationale where possible. Where you were able to stick to the plan, these aspects of your study retain their full confirmatory status. Where deviations were necessary, you and your readers have the information needed to judge whether the deviation was justified. Three additional tools may be helpful in such cases. Firstly, one can anticipate some potential issues and plan for them in advance using a ‘decision-tree’. For example, one might pre-specify that “if the data are normally distributed we will use a Student’s t-test, but if the data are not normally distributed we will use a Mann-Whitney U test”. Of course, the number of potential things that can “go wrong” and require deviation from the pre-specified plan is likely to multiply quite rapidly, and this approach can become untenable.
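The decision tree in the example above can be written down as analysis code and included in (or alongside) the pre-registration, which removes any ambiguity about how each branch will be applied. The following is a minimal sketch, not from the original guide: it assumes two independent groups, a Shapiro-Wilk normality check, and an alpha of .05 for that check, all of which are illustrative choices.

```python
# Minimal sketch of a pre-registered decision tree (hypothetical thresholds):
# if both groups pass a Shapiro-Wilk normality check, run Student's t-test;
# otherwise fall back to the Mann-Whitney U test.
import numpy as np
from scipy import stats

def preregistered_comparison(group_a, group_b, alpha=0.05):
    """Apply the pre-specified decision tree; return the branch taken and p-value."""
    _, p_norm_a = stats.shapiro(group_a)
    _, p_norm_b = stats.shapiro(group_b)
    if p_norm_a > alpha and p_norm_b > alpha:
        branch = "Student's t-test"
        _, p = stats.ttest_ind(group_a, group_b)
    else:
        branch = "Mann-Whitney U test"
        _, p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
    return branch, p

# Simulated data; with real data, only the pre-registered code path runs.
rng = np.random.default_rng(1)
branch, p = preregistered_comparison(rng.normal(0, 1, 40), rng.normal(0.5, 1, 40))
print(branch, p)
```

Registering the code itself (rather than a prose description of it) means the branch that fires is fully determined by the data, leaving no room for post hoc discretion.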

A longer-term solution is for an individual researcher or lab to write a “Standard Operating Procedures” (SOP) document, which specifies their default approach to handling various issues that may arise during the studies that they typically run (Lin & Green, 2016). For example, the document might specify which data points are considered “outliers” in reaction time data, and how those outliers are typically handled (e.g., excluded or retained). SOPs should also be registered, and either included along with your main pre-registration as an appendix or linked to directly. Of course, SOPs are only useful for issues that you have already anticipated and planned for, but they can be a valuable safety net when you forget to include relevant information in your main pre-registration. SOPs can be continuously updated whenever new scenarios are encountered, such that there is a plan in place for future occasions.

Finally, a useful approach for handling unanticipated protocol deviations is to perform a sensitivity analysis (Thabane et al., 2013). Sensitivity analyses are employed when there are multiple reasonable ways of specifying an analysis. For example, how should one define exclusion criteria for outliers? In a sensitivity analysis, a researcher runs an analysis several times using different specifications (e.g., exclusion thresholds), and evaluates the impact of those specifications on the final outcome. An outcome is considered ‘robust’ if it remains stable under multiple reasonable analysis specifications. One might also consider running a multiverse analysis: a form of factorial sensitivity analysis where different specifications are simultaneously considered for multiple aspects of the analysis pipeline, giving a much more in-depth picture of the robustness of the outcome under scrutiny (Steegen et al., 2016; also see Simonsohn et al., 2015). Indeed, multiverse analyses (and sensitivity analyses more broadly) are highly informative even when one has been able to stick to the pre-registered plan. To the extent that the pre-registered analysis plan included fairly arbitrary specifications, it is possible that the plan does not provide the most robust indication of the outcome under scrutiny. The gold standard here is to pre-register a plan for a multiverse analysis (Steegen et al., 2016).
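The sensitivity-analysis logic can be illustrated with a rough sketch: re-run the same comparison under several reasonable specifications and check whether the estimate moves. Everything below is illustrative; the data are simulated reaction times and the exclusion cutoffs are hypothetical values, not recommendations.

```python
# Sketch of a sensitivity analysis over outlier-exclusion thresholds.
# Data are simulated reaction times (ms); the cutoffs are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
# Two conditions with a true ~20 ms difference, plus a few slow outliers.
cond_a = np.concatenate([rng.normal(500, 50, 190), rng.uniform(1500, 3000, 10)])
cond_b = np.concatenate([rng.normal(520, 50, 190), rng.uniform(1500, 3000, 10)])

estimates = {}
for cutoff in (1000, 1500, 2500):  # candidate upper RT cutoffs (ms)
    a = cond_a[cond_a < cutoff]
    b = cond_b[cond_b < cutoff]
    estimates[cutoff] = b.mean() - a.mean()  # effect under this specification

# If the pattern is stable across cutoffs, the outcome is robust
# to this particular analytic choice.
for cutoff, diff in estimates.items():
    print(f"cutoff {cutoff} ms: mean difference = {diff:.1f} ms")
```

A full multiverse analysis extends the same idea factorially: instead of one loop over cutoffs, one loops over every combination of reasonable choices (e.g., cutoffs × transformation × covariate sets) and inspects the whole grid of outcomes.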

When referring to this website, please cite:
Klein, O., Hardwicke, T. E., Aust, F., Breuer, J., Danielsson, H., Hofelich Mohr, A., … Frank, M. C. (2018). A Practical Guide for Transparency in Psychological Science. Collabra: Psychology, 4(1), 20. https://doi.org/10.1525/collabra.158