==Measures of reproducibility and repeatability==
In chemistry, the terms reproducibility and repeatability are used with a specific quantitative meaning.<ref>{{Cite journal |last= |first= |title=IUPAC - reproducibility (R05305) |url=https://goldbook.iupac.org/terms/view/R05305 |access-date=2022-03-04 |website=[[International Union of Pure and Applied Chemistry]]|doi= 10.1351/goldbook.R05305|doi-access=free}}</ref> In inter-laboratory experiments, a concentration or other quantity of a chemical substance is measured repeatedly in different laboratories to assess the variability of the measurements. The standard deviation of the difference between two values obtained within the same laboratory is called ''repeatability''; the standard deviation of the difference between two measurements from different laboratories is called ''reproducibility''.<ref name="ASTM E177">{{cite web|url=https://www.astm.org/Standards/E177.htm |title=Standard Practice for Use of the Terms Precision and Bias in ASTM Test Methods |year=2014 |author=Subcommittee E11.20 on Test Method Evaluation and Quality Control |publisher=ASTM International |id=ASTM E177}}</ref>
These measures are related to the more general concept of [[variance component]]s in [[metrology]].
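The two standard deviations can be estimated from inter-laboratory data as one-way variance components, in the style of ISO 5725 and ASTM E177. The following is a minimal sketch for balanced data (every laboratory reports the same number of replicates); the measurement values are hypothetical, chosen only for illustration.

```python
import math

# Hypothetical inter-laboratory data: each inner list holds repeated
# measurements of the same quantity made in one laboratory.
labs = [
    [10.1, 10.3, 10.2],   # laboratory A
    [10.6, 10.4, 10.5],   # laboratory B
    [9.9, 10.0, 10.1],    # laboratory C
]

def repeatability_reproducibility(labs):
    """Estimate the repeatability and reproducibility standard deviations
    from balanced inter-laboratory data (one-way variance components)."""
    p = len(labs)                 # number of laboratories
    n = len(labs[0])              # replicates per laboratory (assumed equal)
    lab_means = [sum(lab) / n for lab in labs]
    grand_mean = sum(lab_means) / p

    # Within-laboratory (repeatability) variance, pooled over laboratories
    s_r2 = sum(
        sum((x - m) ** 2 for x in lab)
        for lab, m in zip(labs, lab_means)
    ) / (p * (n - 1))

    # Between-laboratory variance component, from the variance of lab means
    s_means2 = sum((m - grand_mean) ** 2 for m in lab_means) / (p - 1)
    s_L2 = max(s_means2 - s_r2 / n, 0.0)

    s_r = math.sqrt(s_r2)         # repeatability standard deviation
    s_R = math.sqrt(s_L2 + s_r2)  # reproducibility standard deviation
    return s_r, s_R

s_r, s_R = repeatability_reproducibility(labs)
```

Because reproducibility folds in the between-laboratory component on top of the within-laboratory scatter, s_R is always at least as large as s_r.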
===Reproducible research in practice===
Psychology has seen a renewal of internal concerns about irreproducible results (see the entry on [[replicability crisis]] for empirical results on success rates of replications). A 2006 study showed that, of 141 authors of American Psychological Association (APA) empirical articles, 103 (73%) did not share their data over a six-month period.<ref>{{Cite journal|last1=Wicherts |first1=J. M. |last2=Borsboom |first2=D. |last3=Kats |first3=J. |last4=Molenaar |first4=D. |title=The poor availability of psychological research data for reanalysis |doi=10.1037/0003-066X.61.7.726 |journal=American Psychologist |volume=61 |issue=7 |pages=726–728 |year=2006 |pmid=17032082}}</ref> In a follow-up study published in 2015, 246 of 394 contacted authors of papers in APA journals (62%) did not share their data upon request.<ref>{{Cite journal|last1=Vanpaemel |first1=W. |last2=Vermorgen |first2=M. |last3=Deriemaecker |first3=L. |last4=Storms |first4=G. |title=Are we wasting a good crisis? The availability of psychological research data after the storm |doi=10.1525/collabra.13 |journal=Collabra |volume=1 |issue=1 |pages=1–5 |year=2015 |doi-access=free}}</ref> A 2012 paper suggested that researchers should publish data along with their works, and released a dataset alongside as a demonstration.<ref>{{Cite journal|last1=Wicherts |first1=J. M. |last2=Bakker |first2=M. |doi=10.1016/j.intell.2012.01.004 |title=Publish (your data) or (let the data) perish! Why not publish your data too? 
|journal=Intelligence |volume=40 |issue=2 |pages=73–76 |year=2012}}</ref> In 2017, an article published in ''[[Scientific Data (journal)|Scientific Data]]'' suggested that this may not be sufficient and that the whole analysis context should be disclosed.<ref>{{cite journal|last1=Pasquier|first1=Thomas|last2=Lau|first2=Matthew K.|last3=Trisovic|first3=Ana|last4=Boose|first4=Emery R.|last5=Couturier|first5=Ben|last6=Crosas|first6=Mercè|last7=Ellison|first7=Aaron M.|last8=Gibson|first8=Valerie|last9=Jones|first9=Chris R.|last10=Seltzer|first10=Margo|title=If these data could talk|journal=Scientific Data|date=5 September 2017|volume=4|issue=1 |pages=170114|doi=10.1038/sdata.2017.114|pmid=28872630|pmc=5584398|bibcode=2017NatSD...470114P}}</ref>
In economics, concerns have been raised about the credibility and reliability of published research. In other sciences, reproducibility is regarded as fundamental and is often a prerequisite for publication; in economics, however, it has not been treated as a high priority. Most peer-reviewed economics journals do not take substantive measures to ensure that published results are reproducible, although the top economics journals have been moving to adopt mandatory data and code archives.<ref>{{cite journal |last1=McCullough |first1=Bruce |title=Open Access Economics Journals and the Market for Reproducible Economic Research |journal=Economic Analysis and Policy |date=March 2009 |volume=39 |issue=1 |pages=117–126 |doi=10.1016/S0313-5926(09)50047-1|doi-access=free }}</ref> There are few or no incentives for researchers to share their data, and authors would have to bear the costs of compiling data into reusable forms. Economic research is often not reproducible because only a portion of journals have adequate disclosure policies for datasets and program code, and even where such policies exist, authors frequently do not comply with them or publishers do not enforce them. A study of 599 articles published in 37 peer-reviewed journals revealed that while some journals achieved high compliance rates, a significant portion complied only partially or not at all. At the article level, the average compliance rate was 47.5%; at the journal level, it was 38%, ranging from 13% to 99%.<ref>{{cite journal |last1=Vlaeminck |first1=Sven |last2=Podkrajac |first2=Felix |title=Journals in Economic Sciences: Paying Lip Service to Reproducible Research? |journal=IASSIST Quarterly |date=2017-12-10 |volume=41 |issue=1–4 |page=16 |doi=10.29173/iq6 |url=https://iassistquarterly.com/index.php/iassist/article/view/6/905|hdl=11108/359 |s2cid=96499437 |hdl-access=free }}</ref>