A basic workflow for reproducible research involves data acquisition, data processing and data analysis. Data acquisition primarily consists of obtaining primary data from a source such as surveys, field observations or experiments, or of obtaining data from an existing source. Data processing involves the review and cleaning of the raw data collected in the first stage, includes data entry, manipulation and filtering, and may be done using software. The data should be digitized and prepared for data analysis. Data may be analysed with the use of software to interpret or visualise statistics and produce the desired results of the research, such as quantitative results including figures and tables. The use of software and automation enhances the reproducibility of research methods.<ref>{{cite book |last1=Kitzes |first1=Justin |last2=Turek |first2=Daniel |last3=Deniz |first3=Fatma |title=The practice of reproducible research case studies and lessons from the data-intensive sciences |date=2018 |publisher=University of California Press |___location=Oakland, California |isbn=9780520294745 |pages=19–30 |jstor=10.1525/j.ctv1wxsc7 |url=http://www.jstor.org/stable/10.1525/j.ctv1wxsc7}}</ref>
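The three stages above can be sketched as a single automated script, so that rerunning it regenerates the same results from the same inputs. This is a minimal illustration only: the sample records and field names are hypothetical stand-ins for a real data source.

```python
import statistics

# Stage 1: data acquisition -- a small in-memory sample stands in for
# records obtained from a survey or an existing data source.
raw_rows = [
    {"id": "1", "response": "4.0"},
    {"id": "2", "response": ""},      # incomplete record
    {"id": "3", "response": "5.0"},
    {"id": "4", "response": "3.0"},
]

# Stage 2: data processing -- filter out incomplete records and
# convert the remaining fields to numeric values.
values = [float(r["response"]) for r in raw_rows if r["response"]]

# Stage 3: data analysis -- compute summary statistics that could
# feed a figure or table in the final report.
result = {
    "n": len(values),
    "mean": statistics.mean(values),
    "stdev": round(statistics.pstdev(values), 3),
}
print(result)
```

Because every step is encoded in the script rather than performed by hand, another researcher can reproduce the quantitative results exactly by running the same code on the same data.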
There are systems that facilitate such documentation, like the [[R (programming language)|R]] [[Markdown]] language<ref>{{cite journal|last1=Marwick|first1=Ben|last2=Boettiger|first2=Carl|last3=Mullen|first3=Lincoln|title=Packaging data analytical work reproducibly using R (and friends)|journal=The American Statistician|volume=72|date=29 September 2017|pages=80–88|doi=10.1080/00031305.2017.1375986|s2cid=125412832|url=http://ro.uow.edu.au/cgi/viewcontent.cgi?article=6445&context=smhpapers}}</ref>
or the [[Jupyter]] notebook.<ref>{{cite conference|title=Jupyter Notebooks–a publishing format for reproducible computational workflows |url=https://eprints.soton.ac.uk/403913/1/STAL9781614996491-0087.pdf |archive-url=https://web.archive.org/web/20180110174609/https://eprints.soton.ac.uk/403913/1/STAL9781614996491-0087.pdf |archive-date=2018-01-10 |url-status=live |book-title=Positioning and Power in Academic Publishing: Players, Agents and Agendas |editor1-last=Loizides |editor1-first=F |editor2-last=Schmidt |editor2-first=B |publisher=IOS Press |last1=Kluyver |first1=Thomas |last2=Ragan-Kelley |first2=Benjamin |last3=Perez |first3=Fernando |last4=Granger |first4=Brian |last5=Bussonnier |first5=Matthias |last6=Frederic |first6=Jonathan |last7=Kelley}}</ref>
A 2018 study published in the journal ''[[PLOS ONE]]'' found that 14.4% of a sample of public health statistics researchers had shared their data or code or both.<ref>{{Cite journal|date=2018|title=Use of reproducible research practices in public health: A survey of public health analysts.|journal=PLOS ONE|volume=13|issue=9|pages=e0202447|issn=1932-6203|oclc=7891624396|bibcode=2018PLoSO..1302447H|last1=Harris|first1=Jenine K.|last2=Johnson|first2=Kimberly J.|last3=Carothers|first3=Bobbi J.|last4=Combs|first4=Todd B.|last5=Luke|first5=Douglas A.|last6=Wang|first6=Xiaoyan|doi=10.1371/journal.pone.0202447|pmid=30208041|pmc=6135378|doi-access=free}}</ref>
There have been initiatives to improve reporting and hence reproducibility in the medical literature for many years, beginning with the [[Consolidated Standards of Reporting Trials|CONSORT]] initiative, which is now part of a wider initiative, the [[EQUATOR Network]].
This group has recently turned its attention to how better reporting might reduce waste in research,<ref>{{Cite web|title=Research Waste/EQUATOR Conference {{!}} Research Waste |url=http://researchwaste.net/research-wasteequator-conference/ |website=researchwaste.net |url-status=dead |archive-url=https://web.archive.org/web/20161029015313/http://researchwaste.net:80/research-wasteequator-conference/ |archive-date=29 October 2016}}</ref> especially biomedical research.
==Further reading==
* {{cite web|title = Scientists on Science: Reproducibility|date = October 2006|url = https://arstechnica.com/science/2006/10/5744/|author = Timmer, John|work = [[Ars Technica]]}}
* {{cite web|title = Is redoing scientific research the best way to find truth? During replication attempts, too many studies fail to pass muster |date = January 2015 |url = https://www.sciencenews.org/article/redoing-scientific-research-best-way-find-truth |author = Saey, Tina Hesman |work = [[Science News]]}} "Science is not irrevocably broken, [epidemiologist John Ioannidis] asserts. It just needs some improvements."
==External links==
{{Wiktionary}}
* [https://www.cos.io/our-services/top-guidelines Transparency and Openness Promotion Guidelines] from the [[Center for Open Science]]
* [https://www.nist.gov/pml/nist-technical-note-1297 Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results] of the [[National Institute of Standards and Technology]]
* [https://cTuning.org/ae Reproducible papers with artifacts] by the [[CTuning foundation]]
* [https://www.reproducibleresearch.net ReproducibleResearch.net]
{{Medical research studies}}