Tuesday, January 3, 2017

Open Data in Action

It's Tuesday, which means I've received my weekly email from the Association for Psychological Science, in which they provide links to new articles published in one of their journals, Psychological Science. I clicked on the first link, a fascinating article examining the performance gap between upper-class and working-class children, with the intention of blogging about it, when I noticed something interesting. Under the title information were three icons:


There was a link next to these icons where I could click for more information. It turns out these are "Open Practices" badges, which indicate whether the study authors have shared their data and/or study materials. The authors of this study shared both, and sure enough, links to the data and materials are provided at the end of the article. I think we can expect more and more researchers to be willing to share their data and materials in this way.

The last badge, which the authors of this study did not receive, is a rather high bar, but also a really good thing:
Preregistered Badge*

- URL, doi, or other permanent path to the registration in a public, open-access repository
- An analysis plan registered prior to examination of the data or observing the outcomes
- Any additional registrations for the study other than the one reported
- Any changes to the preregistered analysis plan for the primary confirmatory analysis
- All of the analyses described in the registered plan reported in the article

*Authors who have additional unreported registrations or unreported analyses without strong justification (as determined by the editor in chief) will not qualify for a badge.

If the analysis plan was registered prior to observation of outcomes, the Open Practices note will include the notation DE (Data Exist).

If there were strongly justified changes to an analysis plan, the Open Practices note will include the notation TC (Transparent Changes).

Basically, to qualify for this badge, researchers need to register their planned analyses in advance, and if they end up conducting additional analyses, provide strong justification for doing so. This is a great way to counteract p-hacking (see previous posts on p-hacking here and here). As I said, this is a high bar, and few studies will likely qualify, but it's a great first step and a push toward making better data practices the norm.
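
To make the p-hacking point concrete, here's a minimal simulation sketch in Python. Everything in it is a made-up assumption for illustration (two groups drawn from the same population, 30 participants per group, and a hypothetical researcher who tries 5 different outcome measures and reports whichever one comes out significant); none of it comes from the article above.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_simulations = 10_000  # number of simulated "studies" (assumed)
n_per_group = 30        # participants per group (assumed)
n_outcomes = 5          # outcome measures tried per study (assumed)
alpha = 0.05

false_pos_planned = 0   # preregistered: only the one planned outcome counts
false_pos_hacked = 0    # p-hacked: any outcome counts if it's significant

for _ in range(n_simulations):
    p_values = []
    for _ in range(n_outcomes):
        group_a = rng.normal(0, 1, n_per_group)  # same population for both groups...
        group_b = rng.normal(0, 1, n_per_group)  # ...so any "effect" is pure noise
        p_values.append(stats.ttest_ind(group_a, group_b).pvalue)
    if p_values[0] < alpha:
        false_pos_planned += 1
    if min(p_values) < alpha:
        false_pos_hacked += 1

print(f"One preregistered outcome: {false_pos_planned / n_simulations:.1%} false positives")
print(f"Best of {n_outcomes} outcomes: {false_pos_hacked / n_simulations:.1%} false positives")

With a single planned test, the false positive rate stays near the nominal 5%; letting yourself pick the best of five tests pushes it toward 1 - 0.95^5, roughly 23%. That's the gap preregistration is designed to close.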
