Reproducibility

Reproducibility in the original, wide sense is acknowledged only if a replication performed by an ''independent researcher team'' is successful.
 
Unfortunately, the terms ''reproducibility'' and ''replicability'' sometimes appear with their meanings reversed, even in the scientific literature,<ref>{{cite arXiv|title=Terminologies for Reproducible Research|last1=Barba|first1=Lorena A.|year=2018|arxivclass=cs.DL |eprint=1802.03311}}</ref><ref>{{cite web|title=Replicability vs. reproducibility — or is it the other way round?|last1=Liberman|first1=Mark|url=https://languagelog.ldc.upenn.edu/nll/?p=21956|access-date=2020-10-15}}</ref> when researchers do not adhere to the more precise usage.
 
==Measures of reproducibility and repeatability==
 
===Reproducible research method===
The term ''reproducible research'' refers to the idea that scientific results should be documented in such a way that their derivation is fully transparent. This requires a detailed description of the methods used to obtain the data<ref>{{Cite journal|last=King|first=Gary|date=1995|title=Replication, Replication|journal=PS: Political Science and Politics|volume=28|issue=3|pages=444–452|doi=10.2307/420301|jstor=420301|s2cid=250480339 |issn=1049-0965|url=http://nrs.harvard.edu/urn-3:HUL.InstRepos:4266312}}</ref><ref>{{cite journal|last1=Kühne |first1=Martin |last2=Liehr |first2=Andreas W. |year=2009 |title=Improving the Traditional Information Management in Natural Sciences |doi=10.2481/dsj.8.18 |journal=Data Science Journal |volume=8 |issue=1 |pages=18–27 |url=https://datascience.codata.org/jms/article/download/dsj.8.18/198 |doi-access=free}}</ref>
and making the full dataset and the code to calculate the results easily accessible.<ref>{{cite journal|last1=Fomel |first1=Sergey |author-link2=Jon Claerbout |last2=Claerbout |first2=Jon |year=2009 |title=Guest Editors' Introduction: Reproducible Research |journal=Computing in Science and Engineering |volume=11 |issue=1 |pages=5–7 |doi=10.1109/MCSE.2009.14 |bibcode=2009CSE....11a...5F}}</ref><ref name="buckheit1995" /><ref>{{cite journal|title=The Yale Law School Round Table on Data and Core Sharing: "Reproducible Research" |journal=Computing in Science and Engineering |volume=12 |issue=5 |pages=8–12 |doi=10.1109/MCSE.2010.113 |year=2010 |doi-access=free}}</ref><ref>{{cite journal|last1=Marwick |first1=Ben |year=2016 |title=Computational reproducibility in archaeological research: Basic principles and a case study of their implementation |journal=Journal of Archaeological Method and Theory |volume=24 |issue=2 |pages=424–450 |doi=10.1007/s10816-015-9272-9 |s2cid=43958561 |url=https://ro.uow.edu.au/smhpapers/4034}}</ref><ref>{{cite journal|last1=Goodman|first1=Steven N.|last2=Fanelli|first2=Daniele|last3=Ioannidis|first3=John P. A.|title=What does research reproducibility mean?|journal=Science Translational Medicine|date=1 June 2016|volume=8|issue=341|pages=341ps12|doi=10.1126/scitranslmed.aaf5027|pmid=27252173|doi-access=free}}</ref><ref>{{Cite journal|last1=Harris J.K|last2=Johnson K.J|last3=Combs T.B|last4=Carothers B.J|last5=Luke D.A|last6=Wang X|date=2019|title=Three Changes Public Health Scientists Can Make to Help Build a Culture of Reproducible Research|journal=Public Health Rep. Public Health Reports|volume=134|issue=2|pages=109–111|issn=0033-3549|oclc=7991854250|doi=10.1177/0033354918821076|pmid=30657732|pmc=6410469}}</ref>
This is an essential part of [[open science]].
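The principle can be illustrated with a minimal, hypothetical sketch (the function and file names below are illustrative, not drawn from any cited work): an analysis script that pins its random seed and fingerprints its inputs so that an independent party re-running the archived code obtains byte-identical results.

```python
# Hypothetical sketch of a reproducible-research workflow: the data-generating
# process, the analysis code, and a fingerprint of the inputs are all recorded,
# so an independent rerun of the archived script regenerates the same result.
import hashlib
import json
import random

def run_analysis(seed: int = 42) -> dict:
    """Compute a summary statistic deterministically from a fixed seed."""
    rng = random.Random(seed)  # fixed seed -> identical runs everywhere
    data = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    mean = sum(data) / len(data)
    # Fingerprint the exact inputs so readers can verify they used the same data.
    digest = hashlib.sha256(json.dumps(data).encode()).hexdigest()
    return {"mean": round(mean, 6), "data_sha256": digest}

# Two independent runs of the archived code yield identical output,
# which is the property "reproducible research" asks authors to guarantee.
assert run_analysis() == run_analysis()
```

In a real project, the same idea extends to recording software versions and hardware details alongside the data and code, so that the entire computational environment, not just the script, can be reconstructed.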
Psychology has seen a renewal of internal concerns about irreproducible results (see the entry on [[replicability crisis]] for empirical results on success rates of replications). A 2006 study found that, of 141 authors of empirical articles published in American Psychological Association (APA) journals, 103 (73%) did not provide their data over a six-month period.<ref>{{Cite journal|last1=Wicherts |first1=J. M. |last2=Borsboom |first2=D. |last3=Kats |first3=J. |last4=Molenaar |first4=D. |title=The poor availability of psychological research data for reanalysis |doi=10.1037/0003-066X.61.7.726 |journal=American Psychologist |volume=61 |issue=7 |pages=726–728 |year=2006 |pmid=17032082}}</ref> A follow-up study published in 2015 found that 246 of 394 contacted authors of papers in APA journals (62%) did not share their data upon request.<ref>{{Cite journal|last1=Vanpaemel |first1=W. |last2=Vermorgen |first2=M. |last3=Deriemaecker |first3=L. |last4=Storms |first4=G. |title=Are we wasting a good crisis? The availability of psychological research data after the storm |doi=10.1525/collabra.13 |journal=Collabra |volume=1 |issue=1 |pages=1–5 |year=2015 |doi-access=free}}</ref> A 2012 paper suggested that researchers should publish data along with their works, and released a dataset alongside as a demonstration.<ref>{{Cite journal|last1=Wicherts |first1=J. M. |last2=Bakker |first2=M. |doi=10.1016/j.intell.2012.01.004 |title=Publish (your data) or (let the data) perish! Why not publish your data too? |journal=Intelligence |volume=40 |issue=2 |pages=73–76 |year=2012}}</ref> In 2017, an article published in ''[[Scientific Data (journal)|Scientific Data]]'' suggested that this may not be sufficient and that the whole analysis context should be disclosed.<ref>{{cite journal|last1=Pasquier|first1=Thomas|last2=Lau|first2=Matthew K.|last3=Trisovic|first3=Ana|last4=Boose|first4=Emery R.|last5=Couturier|first5=Ben|last6=Crosas|first6=Mercè|last7=Ellison|first7=Aaron M.|last8=Gibson|first8=Valerie|last9=Jones|first9=Chris R.|last10=Seltzer|first10=Margo|title=If these data could talk|journal=Scientific Data|date=5 September 2017|volume=4|pages=170114|doi=10.1038/sdata.2017.114|pmid=28872630|pmc=5584398|bibcode=2017NatSD...470114P}}</ref>
 
In economics, concerns have been raised about the credibility and reliability of published research. In other sciences, reproducibility is regarded as fundamental and is often a prerequisite for publication, but in the economic sciences it has not been treated as a high priority. Most peer-reviewed economics journals take no substantive measures to ensure that published results are reproducible, although the top economics journals have been moving to adopt mandatory data and code archives.<ref>{{cite journal |last1=McCullough |first1=Bruce |title=Open Access Economics Journals and the Market for Reproducible Economic Research |journal=Economic Analysis and Policy |date=March 2009 |volume=39 |issue=1 |pages=117–126 |doi=10.1016/S0313-5926(09)50047-1|doi-access=free }}</ref> There are few or no incentives for researchers to share their data, and authors would have to bear the costs of compiling data into reusable forms. Economic research is often not reproducible, as only a portion of journals have adequate disclosure policies for datasets and program code, and even where such policies exist, authors frequently do not comply with them or publishers do not enforce them. A study of 599 articles published in 37 peer-reviewed journals found that while some journals achieved significant compliance rates, a significant portion complied only partially or not at all. At the article level, the average compliance rate was 47.5%; at the journal level, the average compliance rate was 38%, ranging from 13% to 99%.<ref>{{cite journal |last1=Vlaeminck |first1=Sven |last2=Podkrajac |first2=Felix |title=Journals in Economic Sciences: Paying Lip Service to Reproducible Research? |journal=IASSIST Quarterly |date=2017-12-10 |volume=41 |issue=1–4 |page=16 |doi=10.29173/iq6 |url=https://iassistquarterly.com/index.php/iassist/article/view/6/905|hdl=11108/359 |s2cid=96499437 |hdl-access=free }}</ref>
 
A 2018 study published in the journal ''[[PLOS ONE]]'' found that 14.4% of a sample of public health researchers had shared their data or code or both.<ref>{{Cite journal|date=2018|title=Use of reproducible research practices in public health: A survey of public health analysts.|journal=PLOS ONE|volume=13|issue=9|pages=e0202447|issn=1932-6203|oclc=7891624396|bibcode=2018PLoSO..1302447H|last1=Harris|first1=Jenine K.|last2=Johnson|first2=Kimberly J.|last3=Carothers|first3=Bobbi J.|last4=Combs|first4=Todd B.|last5=Luke|first5=Douglas A.|last6=Wang|first6=Xiaoyan|doi=10.1371/journal.pone.0202447|pmid=30208041|pmc=6135378|doi-access=free}}</ref>