{{short description|Pragmatic methodology}}
{{Evidence-based practices}}
 
'''Evidence-based practice''' is the idea that occupational practices ought to be based on [[scientific evidence]]. The movement towards evidence-based practices attempts to encourage and, in some instances, require professionals and other decision-makers to pay more attention to evidence to inform their decision-making. The goal of evidence-based practice is to eliminate unsound or outdated practices in favor of more-effective ones by shifting the basis for decision making from tradition, intuition, and unsystematic experience to firmly grounded scientific research.<ref>{{Cite journal|last=Leach|first=Matthew J.|s2cid=37311515|date=2006|title=Evidence-based practice: A framework for clinical practice and research design|journal=International Journal of Nursing Practice|language=en|volume=12|issue=5 |pages=248–251|doi=10.1111/j.1440-172X.2006.00587.x |pmid=16942511 |issn=1440-172X}}</ref> The proposal has been controversial, with some arguing that results may not generalize to individuals as well as traditional practices do.<ref>For example: Trinder, L. and Reynolds, S. (eds) (2000) ''Evidence-Based Practice: A Critical Appraisal''. Oxford, Blackwell Science.</ref>
The core activities at the root of evidence-based medicine can be identified as:
* a questioning approach to practice leading to scientific experimentation
* meticulous observation, enumeration, and analysis replacing anecdotal case description
* recording and cataloguing the evidence for systematic retrieval.<ref>Peile, E. (2004) Reflections from medical practice: balancing evidence-based practice with practice based evidence. In, G. Thomas and R. Pring (Eds.) ''Evidence-based Practice in Education''. Open University Press.</ref>
 
Evidence-based practices have been gaining ground since the introduction of [[evidence-based medicine]] and have spread to the [[allied health professions]], [[evidence-based education|education]], [[evidence-based management|management]], [[evidence-based legislation|law]], [[evidence-based policy|public policy]], [[architecture]], and other fields.<ref>{{cite journal|title=Ranking of Risks for Existing and New Building Works|journal=Sustainability|volume=11|issue=10|pages=2863|doi=10.3390/su11102863|year=2019|last1=Li|first1=Rita Yi Man|last2=Chau|first2=Kwong Wing|last3=Zeng|first3=Frankie Fanjie|doi-access=free}}</ref> In light of studies showing problems in scientific research (such as the [[replication crisis]]), there is also a movement to apply evidence-based practices in scientific research itself. Research into the evidence-based practice of science is called [[metascience]].
 
An individual or organisation is justified in claiming that a specific practice is evidence-based if, and only if, three conditions are met. First, the individual or organisation possesses comparative evidence about the effects of the specific practice in comparison to the effects of at least one alternative practice. Second, the specific practice is supported by this evidence according to at least one of the individual's or organisation's preferences in the given practice area. Third, the individual or organisation can provide a sound account for this support by explaining the evidence and preferences that lay the foundation for the claim.<ref>{{Cite journal|last=Gade|first=Christian|date=2023|title=When is it justified to claim that a practice or policy is evidence-based? Reflections on evidence and preferences|journal=Evidence & Policy|volume=20 |issue=2 |pages=244–253 |doi=10.1332/174426421X16905606522863 |doi-access=free}} {{CC-notice|cc=by4}}</ref>
EBP promotes the collection, interpretation, and integration of valid, important and applicable patient-reported, clinician-observed, and research-derived evidence. The best available evidence, moderated by patient circumstances and preferences, is applied to improve the quality of clinical judgments and facilitate cost-effective care.
 
==History==
The term '''evidence-based treatment''' (EBT) or '''empirically-supported treatment''' ('''EST''') refers to preferential use of mental and behavioral health interventions for which systematic empirical research has provided evidence of statistically significant effectiveness as treatments for specific problems. In recent years, EBP has been stressed by professional organizations such as the American Psychological Association and the American Occupational Therapy Association, which have also strongly encouraged their members to carry out investigations to provide evidence supporting or rejecting the use of specific interventions. Pressure toward EBT has also come from public and private health insurance providers, which have sometimes refused coverage of practices lacking in systematic evidence of usefulness.
 
For most of history, professions have based their practices on expertise derived from experience passed down in the form of [[tradition]]. Many of these practices have not been justified by evidence, which has sometimes enabled [[quackery]] and poor performance.<ref>{{cite journal |last1=Bourgault |first1=Annette M. |last2=Upvall |first2=Michele J. |title=De-implementation of tradition-based practices in critical care: A qualitative study |journal=International Journal of Nursing Practice |date=2019 |volume=25 |issue=2 |pages=e12723 |doi=10.1111/ijn.12723|pmid=30656794 }}</ref> Even when overt quackery is not present, the quality and efficiency of tradition-based practices may not be optimal. As the [[scientific method]] has become increasingly recognized as a sound means to evaluate practices, evidence-based practices have become increasingly adopted.
Many areas of professional practice, such as medicine, psychology, and psychiatry, have had periods in their pasts when practice was based on loose bodies of knowledge. Some of that knowledge was simply lore drawn from the experiences of generations of practitioners, and much of it lacked scientific evidence to justify the practices it supported.
 
===Medicine===
In the past this has often left the door open to [[quackery]] perpetrated by individuals who had no training at all in the ___domain, but who wished to convey the impression that they did for profit or other motives. As the scientific method became increasingly recognized as the means to provide sound validation for such methods, it became clear that there needed to be a way of excluding quack practitioners not only as a way of preserving the integrity of the field (particularly medicine), but also of protecting the public from the dangers of their "cures." Furthermore, even where overt quackery was not present, it was recognized that there was a value in identifying what actually does work so it could be improved and promoted.
 
One of the earliest proponents of evidence-based practice was [[Archie Cochrane]], an [[epidemiologist]] who authored the book ''Effectiveness and Efficiency: Random Reflections on Health Services'' in 1972. Cochrane's book argued for the importance of properly testing health care strategies, and was foundational to the evidence-based practice of medicine.<ref>{{cite book |last=Cochrane |first=A.L. |title=Effectiveness and Efficiency. Random Reflections on Health Services |publisher=Nuffield Provincial Hospitals Trust |___location=London |year=1972 |isbn=978-0900574177 |oclc=741462 }}</ref> Cochrane suggested that because resources would always be limited, they should be used to provide forms of health care which had been shown in properly designed evaluations to be effective. Cochrane maintained that the most reliable evidence was that which came from [[randomised controlled trial]]s.<ref>Cochrane Collaboration (2003) http://www.cochrane.org/about-us/history/archie-cochrane {{Webarchive|url=https://web.archive.org/web/20210224135522/https://www.cochrane.org/about-us/history/archie-cochrane |date=2021-02-24 }}</ref>
 
The term "[[evidence-based medicine]]" was introduced by [[Gordon Guyatt]] in 1990 in an unpublished program description, and the term was later first published in 1992.<ref>{{Cite web|title=Development of evidence-based medicine explored in oral history video|url=https://www.ama-assn.org/residents-students/residency/development-evidence-based-medicine-explored-oral-history-video|access-date=2020-12-23|website=American Medical Association|date=27 January 2014 |language=en}}</ref><ref>{{Cite journal|last1=Sackett|first1=D L|last2=Rosenberg|first2=W M|date=November 1995|title=The need for evidence-based medicine.|journal=Journal of the Royal Society of Medicine|volume=88|issue=11|pages=620–624|doi=10.1177/014107689508801105 |issn=0141-0768|pmc=1295384|pmid=8544145}}</ref><ref>{{Cite journal|last=Evidence-Based Medicine Working Group|date=1992-11-04|title=Evidence-based medicine. A new approach to teaching the practice of medicine|journal=JAMA|volume=268|issue=17|pages=2420–2425|doi=10.1001/jama.1992.03490170092032|issn=0098-7484|pmid=1404801}}</ref> This marked the first evidence-based practice to be formally established. Some early experiments in evidence-based medicine involved testing primitive medical techniques such as [[bloodletting]], and studying the effectiveness of modern and accepted treatments. There has been a push for evidence-based practices in medicine by [[insurance]] providers, which have sometimes refused coverage of practices lacking systematic evidence of usefulness. Most clients now expect medical professionals to make decisions based on evidence and to stay informed about the most up-to-date information.
Since the widespread adoption of evidence-based practices in medicine, the use of evidence-based practices has rapidly spread to other fields.<ref>{{cite web |title=A Brief History of Evidence-based Practice |url=https://www.eboptometry.com/content/optometry/article/brief-history-evidence-based-practice-0 |website=Evidence Based Practice in Optometry |publisher=[[University of New South Wales]] |access-date=24 June 2019}}</ref>
Evidence-based treatment is an approach that tries to specify how professionals or other decision-makers should make decisions, by identifying whatever evidence exists for a practice and rating it according to how scientifically sound it is. Its goal is to eliminate unsound or excessively risky practices in favor of those that have better outcomes.
 
===Education===
EBT uses various methods (e.g. carefully summarizing research, putting out accessible research summaries, educating professionals in how to understand and apply research findings) to encourage, and in some instances to force, professionals and other decision-makers to pay more attention to evidence that can inform their decision-making. Where EBT is applied, it encourages professionals to use the best evidence possible, i.e. the most appropriate information available.
 
More recently, there has been a push for [[evidence-based education]]. The use of [[evidence-based learning]] techniques such as [[spaced repetition]] can improve students' rate of learning. Some commentators{{Who|date=August 2013}} have suggested that the lack of any substantial progress in the field of education is attributable to practice resting in the unconnected and noncumulative experience of thousands of individual teachers, each re-inventing the wheel and failing to learn from hard scientific evidence about 'what works'. Opponents of this view argue that it is hard to assess teaching methods because it depends on a host of factors, not least those to do with the style, personality and beliefs of the teacher and the needs of the particular children.<ref>Hammersley, M. (2013) ''The Myth of Research-Based Policy and Practice''. London: Sage.</ref> Others argue the teacher experience could be combined with research evidence, but without the latter being treated as a privileged source.<ref>Thomas, G. and Pring, R. (eds.) (2004). ''Evidence-based Practice in Education''. Open University Press.</ref> This is in line with a school of thought suggesting that evidence-based practice has limitations and a better alternative is to use ''Evidence-informed Practice (EIP)''. 
This process includes quantitative evidence and excludes non-scientific prejudices, while also weighing qualitative factors such as clinical experience and the discernment of practitioners and clients.<ref>{{Cite journal|url=https://doi.org/10.1093/bjsw/bcq149|title=The Myth of Evidence-Based Practice: Towards Evidence-Informed Practice|first1=Isaac|last1=Nevo|first2=Vered|last2=Slonim-Nevo|date=September 1, 2011|journal=The British Journal of Social Work|volume=41|issue=6|pages=1176–1197|via=Silverchair|doi=10.1093/bjsw/bcq149|url-access=subscription}}</ref><ref>{{Cite web|url=https://www.health.tas.gov.au/professionals/working-health-promoting-ways|title=Working in Health Promoting Ways|website=Tasmanian Department of Health|date=25 May 2022 }}</ref><ref>{{Cite web|url=https://www.researchgate.net/publication/260793333|title=Evidence-based Practice vs. Evidence-informed Practice: What's the Difference?}}</ref>
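As an illustration of one such evidence-based learning technique, a minimal spaced-repetition scheduler can be sketched as follows. This is only a sketch: the doubling factor, the one-day starting interval, and the reset-on-failure rule are illustrative assumptions, not any particular published algorithm.

```python
from datetime import date, timedelta

# Minimal spaced-repetition sketch: each successful recall doubles the
# review interval; a failed recall resets it to one day. The factor of 2
# and the 1-day starting interval are illustrative assumptions.

def next_review(interval_days: int, recalled: bool) -> int:
    """Return the next review interval in days."""
    return interval_days * 2 if recalled else 1

interval = 1
schedule = []
today = date(2024, 1, 1)
for recalled in [True, True, True, False, True]:
    today += timedelta(days=interval)
    schedule.append(today)          # date of this review session
    interval = next_review(interval, recalled)

print(interval)  # 2: reset to 1 day by the failure, then doubled once
```

The point of the doubling rule is that material the student reliably recalls is reviewed at exponentially growing intervals, concentrating study time on material that is actually being forgotten.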
==Versus intuition==
Evidence-based practice (EBP) involves complex and conscientious decision-making which is based not only on the available evidence but also on patient characteristics, situations, and preferences. It recognizes that care is individualized and ever changing and involves uncertainties and probabilities.
 
==Versus tradition==
EBP develops individualized guidelines of best practices to inform the improvement of whatever professional task is at hand. Evidence-based practice is a philosophical approach that is in opposition to [[rules of thumb]], folklore, and [[tradition]]. Examples of a reliance on "the way it was always done" can be found in almost every profession, even when those practices are contradicted by new and better information.
 
Some critics argue that since research is conducted on a population level, results may not generalise to each individual within the population. Therefore, evidence-based practices may fail to provide the best solution for each individual, and traditional practices may better accommodate individual differences. In response, researchers have made an effort to test whether particular practices work better for different subcultures, personality types, etc.<ref>{{cite journal |last1=de Groot |first1=M. |last2=van der Wouden |first2=J. M. |last3=van Hell |first3=E. A. |last4=Nieweg |first4=M. B. |title=Evidence-based practice for individuals or groups: let's make a difference |journal=Perspectives on Medical Education |date=31 July 2013 |volume=2 |issue=4 |pages=216–221 |doi=10.1007/s40037-013-0071-2 |pmid=24101580 |pmc=3792230 }}</ref> Some authors have redefined evidence-based practice to include practice that incorporates common wisdom, tradition, and personal values alongside practices based on evidence.<ref name=Buysse2006>{{cite journal |last1=Buysse |first1=V. |last2=Wesley |first2=P.W. |title=Evidence-based practice: How did it emerge and what does it really mean for the early childhood field? |journal=Zero to Three |volume=27 |issue=2 |pages=50–55 |year=2006 |issn=0736-8038}}</ref>
For example, the MATCH study, run at many sites around the US by the National Institute on Alcohol Abuse and Alcoholism (NIAAA), tested whether particular types of clients with alcohol dependence would benefit differentially from three different treatment approaches to which they were randomly assigned.<ref>Project MATCH Research Group (1997). Matching alcoholism treatments to client heterogeneity: Project MATCH posttreatment drinking outcomes. ''Journal of Studies on Alcohol'', 58(1), 7–29.</ref> The aim was not to test the approaches themselves but the matching of clients to treatments. Although the design did not address client choice, it demonstrated little difference between the approaches regardless of most client characteristics, with one exception: clients with high anger scores did better with the non-confrontational motivational enhancement approach, which required only four sessions rather than twelve within Project MATCH.
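The design logic of such a matching study, random assignment to treatment arms followed by outcome comparisons within client subgroups, can be sketched as below. The arm names echo the Project MATCH conditions, but the client data and the randomization helper are invented for illustration.

```python
import random

# Sketch of the randomization step in a matching study: clients are
# randomly assigned to one of several treatment arms; outcomes are later
# compared per arm *within* client subgroups (e.g. high vs. low anger)
# rather than testing the arms in isolation. Client data is illustrative.

ARMS = ["twelve-step facilitation", "cognitive-behavioral therapy",
        "motivational enhancement"]

def randomize(clients, arms, seed=0):
    """Assign each client to a randomly chosen treatment arm."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    return {client: rng.choice(arms) for client in clients}

clients = [f"client-{i}" for i in range(6)]
assignment = randomize(clients, ARMS)

# Group clients by arm, as a subgroup analysis would before comparing
# outcomes across arms within each subgroup.
by_arm = {}
for client, arm in assignment.items():
    by_arm.setdefault(arm, []).append(client)
print(sorted(by_arm))  # arms that received at least one client
```

Randomization is what licenses the causal comparison between arms; the subgroup step then asks whether the treatment effect differs by client type, which is the "matching" hypothesis the study tested.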
 
==Evaluating evidence==
Evidence-based practice is becoming more commonplace in nursing care. Nurses who are baccalaureate prepared are expected to seek out and collaborate with other nurses to demonstrate the benefits of practice that is based on evidence. Studies examining how this type of practice has influenced the standard of care are important, but many are limited by weak internal validity and unstated biases. Evidence-based practice has earned its reputation by examining the reasons why procedures, treatments, and medicines are given, which is important for refining practice so that the goal of assuring patient safety is met.<ref name="Duffy">{{cite journal |author=Duffy P, Fisher C, Munroe D |title=Nursing knowledge, skill, and attitudes related to evidenced based practice: Before or After Organizational Supports |journal=MEDSURG Nursing |date=February 2008 |volume=17 |issue=1 |pages=55–60}}</ref>
{{Further| Hierarchy of evidence}}
[[File:Research design and evidence.svg|thumb|Hierarchy of evidence in medicine.]]
Evaluating scientific research is extremely complex. The process can be greatly simplified with the use of a [[heuristic]] that [[ranking|ranks]] the relative strengths of results obtained from scientific research, called a [[hierarchy of evidence]]. The design of the study and the endpoints measured (such as survival or [[quality of life]]) affect the strength of the evidence. Typically, [[systematic review]]s and [[meta-analysis|meta-analyses]] rank at the top of the hierarchy, [[randomized controlled trials]] rank above [[observational studies]], and [[expert opinion]] and [[case report]]s rank at the bottom. There is broad agreement on the relative strength of the different types of studies, but there is no single, universally accepted hierarchy of evidence. More than 80 different hierarchies have been proposed for assessing medical [[evidence]].<ref>{{Cite news|url=https://www.sciencenews.org/blog/context/critique-medical-evidence-hierarchies|title=Philosophical critique exposes flaws in medical evidence hierarchies|last=Siegfried|first=Tom | name-list-style = vanc |date=2017-11-13|work=Science News|access-date=2018-05-16 }}</ref>
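The heuristic can be made concrete with a short sketch. The numeric ranks below are assumptions chosen for demonstration; since more than 80 competing hierarchies exist, no single ordering is authoritative.

```python
# Illustrative sketch of a hierarchy-of-evidence heuristic. The rank
# values are assumptions for demonstration only; real hierarchies differ
# in how they order and subdivide these study designs.

EVIDENCE_RANK = {
    "systematic review": 1,            # strongest
    "meta-analysis": 1,
    "randomized controlled trial": 2,
    "observational study": 3,
    "case report": 4,
    "expert opinion": 5,               # weakest
}

def strongest_first(studies):
    """Sort (study, design) pairs so the strongest designs come first."""
    return sorted(studies, key=lambda s: EVIDENCE_RANK[s[1]])

# Hypothetical studies, invented for illustration.
studies = [
    ("Smith 2019", "case report"),
    ("Jones 2021", "randomized controlled trial"),
    ("Lee 2020", "systematic review"),
]
print(strongest_first(studies)[0][0])  # prints "Lee 2020"
```

A real appraisal would also weigh study quality and endpoints, which is exactly why such a lookup table is only a first-pass heuristic rather than a substitute for critical reading.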
 
== Applications ==
[[Evidence-based design]] and development decisions are made after reviewing information from [[experiment|repeated rigorous data gathering]] instead of relying on [[expert system|rules]], single observations, or [[norm (sociology)|custom]].{{Citation needed|date=December 2007}} [[Evidence-based medicine]] and evidence-based nursing practice are the two largest fields employing this approach. In [[psychiatry]] and [[community mental health]], evidence-based practice guides have been created by such organizations as the [[Substance Abuse and Mental Health Services Administration]] and the [[Robert Wood Johnson Foundation]], in conjunction with the [[National Alliance on Mental Illness]]. Evidence-based practice has now spread into a diverse range of areas outside of health where the same principles are known by names such as results-focused policy, managing for outcomes, and evidence-informed practice.

===Medicine===
{{Further|Evidence-based medicine}}
Evidence-based medicine is an approach to medical practice intended to optimize [[decision-making]] by emphasizing the use of [[evidence]] from well-designed and well-conducted [[research]]. Although all medicine based on [[science]] has some degree of [[empirical evidence|empirical]] support, evidence-based medicine goes further, classifying evidence by its [[epistemology|epistemologic]] strength and requiring that only the strongest types (coming from [[meta-analysis|meta-analyses]], [[systematic review]]s, and [[randomized controlled trial]]s) can yield strong recommendations; weaker types (such as from [[case-control study|case-control studies]]) can yield only weak recommendations. The term was originally used to describe an approach to teaching the practice of medicine and improving decisions by individual physicians about individual patients.<ref name="Guyatt">{{cite journal |title=Evidence-based medicine. A new approach to teaching the practice of medicine |journal=JAMA |volume=268 |issue=17 |pages=2420–25 |date=November 1992 |pmid=1404801 |doi= 10.1001/JAMA.1992.03490170092032 |author1= Evidence-Based Medicine Working Group|citeseerx=10.1.1.684.3783 }}</ref> Use of the term rapidly expanded to include a previously described approach that emphasized the use of evidence in the design of guidelines and policies that apply to groups of patients and populations ("evidence-based practice policies").<ref name="Eddy">{{cite journal |author=Eddy DM |title = Practice Policies – Where Do They Come from?|journal=Journal of the American Medical Association |year=1990 |volume=263 |issue=9 |pages = 1265, 1269, 1272, 1275 |doi = 10.1001/jama.263.9.1265 |pmid=2304243 }}</ref>
 
Whether applied to medical education, decisions about individuals, guidelines and policies applied to populations, or administration of health services in general, evidence-based medicine advocates that to the greatest extent possible, decisions and policies should be based on evidence, not just the beliefs of practitioners, experts, or administrators. It thus tries to ensure that a [[clinician]]'s opinion, which may be limited by knowledge gaps or biases, is supplemented with all available knowledge from the [[scientific literature]] so that [[best practice]] can be determined and applied. It promotes the use of formal, explicit methods to analyze evidence and makes it available to decision makers. It promotes programs to teach the methods to medical students, practitioners, and policymakers.
This model of care has been studied for 30 years in universities and is gradually making its way into the public sector.{{Citation needed|date=December 2007}} It effectively moves away from the old "medical model" (you have a disease, take this pill) to an "evidence-presented model" that uses the patient as the starting point in diagnosis. EBPs are being employed in the fields of health care, juvenile justice, mental health, and social services, among others.
 
A process has been specified that provides a standardised route for those seeking to produce evidence of the effectiveness of interventions.<ref>{{citation |last=Vine |first=Jim |title=Standard for Producing Evidence – Effectiveness of Interventions – Part 1: Specification (StEv2-1) |publisher=HACT |year=2016 |isbn=978-1-911056-01-0 |at=[http://www.hact.org.uk/standards-evidence-housing Standards of Evidence]}}</ref> Originally developed to establish processes for the production of evidence in the housing sector, the standard is general in nature and is applicable across a variety of practice areas and potential outcomes of interest.
Key elements in using the best evidence to guide the practice of any professional include the development of questions using research-based evidence, the level and types of evidence to be used, and the assessment of effectiveness after completing the task or effort. One obvious problem with EBP in any field is the use of poor quality, contradictory, or incomplete evidence. Evidence-based practice continues to be a developing body of work for professions as diverse as [[education]], [[psychology]], [[economics]], [[nursing]], [[social work]] and [[architecture]].
 
===Mental health===
To improve the dissemination of evidence-based practices, the [http://www.abct.org/ Association for Behavioral and Cognitive Therapies (ABCT)] and the Society of Clinical Child and Adolescent Psychology ([[Society of Clinical Child and Adolescent Psychology|SCCAP]], [[Divisions of the American Psychological Association|Division 53]] of the [[American Psychological Association]])<ref>{{Cite web|url=https://sccap53.org/|title=SCCAP Division 53 – The Society for Child Clinical and Adolescent Psychology}}</ref> maintain updated information on their websites on evidence-based practices in psychology for practitioners and the general public. An evidence-based practice consensus statement was developed at a summit on mental healthcare in 2018. As of June 23, 2019, this statement has been endorsed by 36 organizations.
According to Norcross et al. (2006), "the burgeoning evidence based practice movement in mental health attempts to identify, implement, and disseminate treatments that have been proven demonstrably effective according to the empirical evidence". However, Norcross et al. also suggest that it is perhaps more useful to identify what does not work. They conducted a survey in which experts rated a range of treatments from "not at all discredited" to "certainly discredited". Examples of the range of discredited psychotherapies include [[angel therapy]], the use of pyramid structures, [[orgone]] therapy, past lives therapy, [[chiropractic]] manipulation, and [[Erhard Seminars Training]]. A limitation of the study was that some treatments may not have been rated as discredited even though there was no evidence of their efficacy. It was recommended that future polls take this into consideration, though the researchers concluded that the study does identify the dark side or "quack factor" of modern mental health practice.<ref name="Norcross 2006">Norcross, J.C., Garofalo, A., & Koocher, G. (2006). Discredited psychological treatments and tests: A Delphi poll. ''Professional Psychology: Research and Practice'', 37(5), 515–522. {{doi|10.1037/0735-7028.37.5.515}}</ref>
 
===Levels of evidence and evaluation of research===
Because conclusions about research results are made in a probabilistic manner, it is impossible to work with two simple categories of outcome research reports. Research evidence does not fall simply into "evidence-based" and "non-evidence-based" classes, but can be anywhere on a continuum from one to the other, depending on factors such as the way the study was designed and carried out. The existence of this continuum makes it necessary to think in terms of "levels of evidence", or categories of stronger or weaker evidence that a treatment is effective. To classify a research report as strong or weak evidence for a treatment, it is necessary to evaluate the quality of the research as well as the reported outcome.<ref>Mercer, J., & Pignotti, M. (2007). Shortcuts cause errors in systematic research syntheses: Rethinking evaluation of mental health interventions. ''Scientific Review of Mental Health Practice'', 5(2), 59–77.</ref>

===Metascience===
{{Main|Metascience}}
There has since been a movement for the use of evidence-based practice in conducting scientific research in an attempt to address the [[replication crisis]] and other major issues affecting scientific research.<ref>{{cite web |last1=Rathi |first1=Akshat |title=Most science research findings are false. Here's how we can change that |url=https://qz.com/530064/most-science-research-findings-are-false-heres-how-we-can-change-that/ |website=Quartz |date=22 October 2015 |access-date=13 June 2019 |language=en}}</ref> The application of evidence-based practices to research itself is called [[metascience]], which seeks to increase the quality of scientific research while reducing waste. It is also known as "research on research" and "the science of science", as it uses [[research methods]] to study how research is done and where improvements can be made. The five main areas of research in metascience are methodology, reporting, [[reproducibility]], [[scholarly peer review|evaluation]], and incentives.<ref name=Ioannidis2015>{{cite journal |last1=Ioannidis |first1=John P. A. |last2=Fanelli |first2=Daniele |last3=Dunne |first3=Debbie Drake |last4=Goodman |first4=Steven N. |title=Meta-research: Evaluation and Improvement of Research Methods and Practices |journal=PLOS Biology |date=2 October 2015 |volume=13 |issue=10 |pages=e1002264 |doi=10.1371/journal.pbio.1002264 |pmid=26431313 |pmc=4592065 |issn=1544-9173 |doi-access=free }}</ref> Metascience has produced a number of reforms in science such as the use of [[pre-registration (science)|study pre-registration]] and the implementation of [[EQUATOR Network|reporting guidelines]] with the goal of bettering scientific research practices.<ref name=Ioannidis2015 />
 
Evaluation of research quality can be a difficult task requiring meticulous reading of research reports and background information. It may not be appropriate simply to accept the conclusion reported by the researchers; for example, in one investigation of outcome studies, 70% were found to have stated conclusions unjustified by their research design.<ref>Rubin, A., & Parrish, D. (2007). Problematic phrases in the conclusions of published outcome studies. Research on Social Work Practice, 17(3), 334-347.</ref>

Although early consideration of EBP issues by psychologists provided a stringent but simple definition of EBP, requiring two independent randomized controlled trials supporting the effectiveness of a treatment,<ref>Chambless, D., & Hollon, S. (1998). Defining empirically supportable therapies. Journal of Consulting and Clinical Psychology, 66, 7-18.</ref> it became clear that additional factors needed to be considered. These included both the need for lower but still useful levels of evidence, and the need to require even the "gold standard" randomized trials to meet further criteria.

A number of protocols for the evaluation of research reports have been suggested and are summarized here. Some divide research evidence dichotomously into EBP and non-EBP categories, while others employ multiple levels of evidence. Although the criteria used by the various protocols overlap to some extent, they do not do so completely.

The Kaufman Best Practices Project approach did not use an EBP category per se, but instead provided a protocol for selecting the most acceptable treatment from a group of interventions intended to treat the same problems.<ref>Kaufman Best Practices Project. (2004). Kaufman Best Practices Project Final Report: Closing the Quality Chasm in Child Abuse Treatment; Identifying and Disseminating Best Practices. Retrieved July 20, 2007, from http://academicdepartments.musc.edu/ncvc/resources_prof/reports_prof.thm.</ref> To be designated as "best practice", a treatment would need to have a sound theoretical base, general acceptance in clinical practice, and considerable anecdotal or clinical literature. This protocol also requires absence of evidence of harm, at least one randomized controlled study, descriptive publications, a reasonable amount of necessary training, and the possibility of being used in common settings. Missing from this protocol are the possibility of nonrandomized designs (in which clients or practitioners decide whether an individual will receive a certain treatment), the need to specify the type of comparison group used, the existence of confounding variables, the reliability or validity of outcome measures, the type of statistical analysis required, and a number of other factors required by some evaluation protocols.<ref>Mercer, J., & Pignotti, M. (2007). Shortcuts cause errors in systematic research syntheses. Scientific Review of Mental Health Practice, 5(2), 59-77.</ref>

A protocol suggested by Saunders et al.<ref>Saunders, B., Berliner, L., & Hanson, R. (2004). Child physical and sexual abuse: Guidelines for treatments. Retrieved September 15, 2006, from http://www.musc.edu/cvc.guidel.htm</ref> assigns research reports to six categories, on the basis of research design, theoretical background, evidence of possible harm, and general acceptance. To be classified under this protocol, there must be descriptive publications, including a manual or similar description of the intervention. This protocol does not consider the nature of any comparison group, the effect of confounding variables, the nature of the statistical analysis, or a number of other criteria. Interventions are assessed as belonging to Category 1, well-supported, efficacious treatments, if there are two or more randomized controlled outcome studies comparing the target treatment to an appropriate alternative treatment and showing a significant advantage to the target treatment. Interventions are assigned to Category 2, supported and probably efficacious treatment, based on positive outcomes of nonrandomized designs with some form of control, which may involve a non-treatment group. Category 3, supported and acceptable treatment, includes interventions supported by one controlled or uncontrolled study, or by a series of single-subject studies, or by work with a different population than the one of interest. Category 4, promising and acceptable treatment, includes interventions that have no support except general acceptance and clinical anecdotal literature; however, any evidence of possible harm excludes treatments from this category. Category 5, innovative and novel treatment, includes interventions that are not thought to be harmful, but are not widely used or discussed in the literature. Category 6, concerning treatment, is the classification for treatments that have the possibility of doing harm, as well as having unknown or inappropriate theoretical foundations.

A protocol for evaluation of research quality was suggested by a report from the Centre for Reviews and Dissemination, prepared by Khan et al. and intended as a general method for assessing both medical and psychosocial interventions.<ref>Khan, K.S., et al. (2001). CRD Report 4. Stage II. Conducting the review. Phase 5. Study quality assessment. York, UK: Centre for Reviews and Dissemination, University of York. Retrieved July 20, 2007 from http://www.york.ac.uk/inst/crd/pdf/crd_4ph5.pdf</ref> While strongly encouraging the use of randomized designs, this protocol noted that such designs were useful only if they met demanding criteria, such as true randomization and concealment of the assigned treatment group from the client and from others, including the individuals assessing the outcome. The Khan et al. protocol emphasized the need to make comparisons on the basis of "intention to treat" in order to avoid problems related to greater attrition in one group. It also presented demanding criteria for nonrandomized studies, including matching of groups on potential confounding variables, adequate descriptions of groups and treatments at every stage, and concealment of treatment choice from persons assessing the outcomes. This protocol did not provide a classification of levels of evidence, but included or excluded treatments from classification as evidence-based depending on whether the research met the stated standards.

An assessment protocol has been developed by the U.S. National Registry of Evidence-Based Practices and Programs (NREPP).<ref>National Registry of Evidence-Based Practices and Programs (2007). NREPP Review Criteria. Retrieved March 10, 2008 from http://www.nrepp.samsha.gov/review-criteria.htm</ref> Evaluation under this protocol occurs only if an intervention has already had one or more positive outcomes (with a probability of less than .05) reported, if these have been published in a peer-reviewed journal or an evaluation report, and if documentation such as training materials has been made available. The NREPP evaluation, which assigns quality ratings from 1 to 4 on certain criteria, examines the reliability and validity of outcome measures used in the research, evidence for intervention fidelity (predictable use of the treatment in the same way every time), levels of missing data and attrition, potential confounding variables, and the appropriateness of statistical handling, including sample size.

A protocol suggested by Mercer and Pignotti<ref>Mercer, J., & Pignotti, M. (2007). Shortcuts cause errors in systematic research syntheses. Scientific Review of Mental Health Practice, 5(2), 59-77.</ref> uses a taxonomy intended to classify interventions on both research quality and other criteria. In this protocol, evidence-based interventions are those supported by work with randomized designs employing comparisons to established treatments, independent replications of results, blind evaluation of outcomes, and the existence of a manual. Evidence-supported interventions are those supported by nonrandomized designs, including within-subjects designs, and otherwise meeting the criteria for the previous category. Evidence-informed treatments involve case studies or interventions tested on populations other than the targeted group, without independent replications; a manual exists, and there is no evidence of harm or potential for harm. Belief-based interventions have no published research reports, or reports based on composite cases; they may be based on religious or ideological principles, or may claim a basis in accepted theory without an acceptable rationale; there may or may not be a manual, and there is no evidence of harm or potential for harm. Finally, the category of potentially harmful treatments includes interventions such that harmful mental or physical effects have been documented, or a manual or other source shows the potential for harm.

Protocols for evaluation of research quality are still in development. So far, the available protocols pay relatively little attention to whether outcome research is relevant to efficacy (the outcome of a treatment performed under ideal conditions) or to effectiveness (the outcome of the treatment performed under ordinary, expectable conditions).

===Education===
{{Main|Evidence-based education}}

'''Evidence-based education''' (EBE), also known as ''evidence-based interventions'', is a model in which policy-makers and educators use empirical evidence to make informed decisions about education interventions (policies, practices, and programs).<ref>Trinder, L. and Reynolds, S. (eds) (2000) ''Evidence-Based Practice: A critical appraisal'', Oxford, Blackwell Science.</ref> In other words, decisions are based on scientific evidence rather than opinion.

EBE has gained attention since English author [[David H. Hargreaves]] suggested in 1996 that education would be more effective if teaching, like medicine, was a "research-based profession".<ref>{{Cite web|url=https://www.researchgate.net/publication/271952371|title=Knowledge creation as an approach to facilitating evidence informed practice: Examining ways to measure the success of using this method with early years practitioners in Camden (London)}}</ref>

Since 2000, studies in Australia, England, Scotland and the US have supported the use of research to improve educational practices in teaching reading.<ref name="Teaching Reading">{{cite web |url=http://research.acer.edu.au/cgi/viewcontent.cgi?filename=2&article=1004&context=tll_misc&type=additional |title=Teaching Reading |format=PDF |work= Australian Government Department of Education, Science and Training. }}</ref><ref>{{cite web|url=http://publications.teachernet.gov.uk/eOrderingDownload/0201-2006PDF-EN-01.pdf|title=Independent review of the teaching of early reading, 2006|access-date=2020-07-31|archive-date=2010-05-12|archive-url=http://webarchive.nationalarchives.gov.uk/20100512233640/http://publications.teachernet.gov.uk/eOrderingDownload/0201-2006PDF-EN-01.pdf|url-status=dead}}</ref><ref>{{Citation |archive-url=https://web.archive.org/web/20150222153240/http://www.gov.scot/Publications/2005/02/20682/52383 |archive-date=2015-02-22 |title=Insight 17 - A seven year study of the effects of synthetic phonics teaching on reading and spelling attainment |publisher=IAC:ASU Schools |last1=Johnston |first1=Rhona S |last2=Watson |first2=Joyce E |issn=1478-6796 |url=http://www.gov.scot/Publications/2005/02/20682/52383 }}</ref>

In 1997, the [[National Institute of Child Health and Human Development]] convened a national panel to assess the effectiveness of different approaches used to teach children to read. The resulting [[National Reading Panel]] examined quantitative research studies on many areas of reading instruction, including phonics and whole language. In 2000 it published a report entitled ''Teaching Children to Read: An Evidence-based Assessment of the Scientific Research Literature on Reading and its Implications for Reading Instruction'' that provided a comprehensive review of what was known about best practices in reading instruction in the U.S.<ref>{{cite web |url=http://www.nationalreadingpanel.org/publications/summary.htm |title=National Reading Panel (NRP) – Publications and Materials – Summary Report |year=2000 |work=National Institute of Child Health and Human Development. (2000). Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction (NIH Publication No. 00-4769). Washington, DC: U.S. Government Printing Office. |url-status=dead |archive-url=https://web.archive.org/web/20100610211113/http://www.nationalreadingpanel.org/publications/summary.htm |archive-date=2010-06-10 }}</ref><ref>{{cite web |url=http://www.nationalreadingpanel.org/publications/subgroups.htm |title=National Reading Panel (NRP) – Publications and Materials – Reports of the Subgroups |year=2000 |work=National Institute of Child Health and Human Development. (2000). Report of the National Reading Panel. Teaching children to read: an evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups (NIH Publication No. 00-4754). Washington, DC: U.S. Government Printing Office |url-status=dead |archive-url=https://web.archive.org/web/20100611011153/http://www.nationalreadingpanel.org/publications/subgroups.htm |archive-date=2010-06-11 }}</ref><ref>{{Cite web|url=https://www.nichd.nih.gov/sites/default/files/publications/pubs/Documents/PRFbooklet.pdf|title=Teacher's Guide, Put Reading First - K-3, NICHD, edpubs@inet.ed.gov}}</ref>

This occurred around the same time as such international studies as the [[Programme for International Student Assessment]] in 2000 and the [[Progress in International Reading Literacy Study]] in 2001.

Subsequently, evidence-based practice in education (also known as [[scientifically based research]]) came into prominence in the U.S. under the [[No Child Left Behind Act]] of 2001, which was replaced in 2015 by the [[Every Student Succeeds Act]].

In 2002 the [[U.S. Department of Education]] founded the [[Institute of Education Sciences]] to provide scientific evidence to guide education practice and policy.

English author [[Ben Goldacre]] advocated in 2013 for systemic change and more [[randomized controlled trial]]s to assess the effects of educational interventions.<ref>{{Cite web|url=https://www.gov.uk/government/news/building-evidence-into-education|title=Building Evidence into Education|website=gov.uk}}</ref> In 2014 the [[National Foundation for Educational Research]], Berkshire, England<ref>{{Cite web|url=https://www.nfer.ac.uk/|title=Home|website=NFER}}</ref> published a report entitled ''Using Evidence in the Classroom: What Works and Why''.<ref>{{cite web|url=https://www.nfer.ac.uk/publications/impa01/impa01.pdf|title= Using Evidence in the Classroom: What Works and Why, Nelson, J. and O'Beirne, C. (2014). Slough: NFER. ISBN 978-1-910008-07-2}}</ref> In 2014 the [[British Educational Research Association]] and the [[Royal Society of Arts]] advocated for a closer working partnership between teacher-researchers and the wider academic research community.<ref>{{Cite web|url=https://www.bera.ac.uk/wp-content/uploads/2014/02/BERA-RSA-Interim-Report.pdf|title=The role of research in teacher education: reviewing the evidence-BERA-RSA, January 2014}}</ref><ref>{{Cite web|url=https://www.bera.ac.uk/project/research-and-teacher-education|title=Research and Teacher Education|website=www.bera.ac.uk}}</ref>

====Reviews of existing research on education====
The following websites offer free analysis and information on education research:
* ''The Best Evidence Encyclopedia''<ref>{{Cite web|url=https://bestevidence.org/|title=Best Evidence Encyclopedia|website=Best Evidence Encyclopedia}}</ref> is a free website created by the [[Johns Hopkins University]] School of Education's Center for Data-Driven Reform in Education (established in 2004) and is funded by the [[Institute of Education Sciences]], U.S. Department of Education. It gives educators and researchers reviews about the strength of the evidence supporting a variety of English programs available for students in grades [[K–12]]. The reviews cover programs in areas such as ''Mathematics, Reading, Writing, Science, Comprehensive school reform, and Early Childhood Education'', and include such topics as the ''effectiveness of technology and struggling readers''.
* ''The [[Education Endowment Foundation]]'' was established in 2011 by The [[Sutton Trust]], as a lead charity in partnership with Impetus Trust, together being the government-designated What Works Centre for UK Education.<ref>{{Cite web|url=https://educationendowmentfoundation.org/|title=Education Endowment Foundation|website=educationendowmentfoundation.org}}</ref>
* ''Evidence for the Every Student Succeeds Act''<ref>{{cite web|url=https://www.evidenceforessa.org|title=Evidence for ESSA}}</ref> began in 2017 and is produced by the Center for Research and Reform in Education<ref>{{Cite web|url=https://education.jhu.edu/crre/|title=Center for Research and Reform in Education|date=11 January 2024 }}</ref> at [[Johns Hopkins University School of Education]]. It offers free up-to-date information on current PK-12 programs in reading, writing, math, science, and others that meet the standards of the [[Every Student Succeeds Act]] (the United States K–12 public education policy signed by President Obama in 2015).<ref>{{Cite web|url=https://www.ed.gov/essa?src=rn|title=Every Student Succeeds Act (ESSA) &#124; U.S. Department of Education|website=www.ed.gov}}</ref> It covers programs that meet the Every Student Succeeds Act evidence standards as well as those that do not.
* ''What Works Clearinghouse'',<ref name="auto">{{Cite web|url=https://ies.ed.gov/ncee/wwc/|title=WWC &#124; Find What Works!|website=ies.ed.gov}}</ref> established in 2002, evaluates numerous educational programs, in twelve categories, by the quality and quantity of the evidence and by effectiveness. It is operated by the federal National Center for Education Evaluation and Regional Assistance, part of the [[Institute of Education Sciences]].<ref name="auto"/>
* ''Social programs that work'' is administered by [[Arnold Ventures LLC]]'s Evidence-Based Policy team. The team is composed of the former leadership of the ''Coalition for Evidence-Based Policy'', a nonprofit, nonpartisan organization advocating the use of well-conducted ''[[randomized controlled trial]]s'' (RCTs) in policy decisions.<ref>[http://toptierevidence.org/ Social programs that work]</ref> It offers information on twelve types of social programs, including education.

[[Evidence-based education#Other sources of information|A variety of other organizations]] offer information on research and education.

===Meta-analyses and systematic research syntheses===
When there are many small or weak studies of an intervention, a statistical meta-analysis can be used to combine the studies' results and to draw a stronger conclusion about the outcome of the treatment. This can be an important contribution to the establishment of a foundation of evidence about an intervention.

In other situations, facts about a group of study outcomes may be gathered and discussed in the form of a systematic research synthesis (SRS).<ref>Cooper, H. (2003). Editorial. Psychological Bulletin, 129, 3-9.</ref> An SRS can be more or less useful, depending on the evaluation protocol chosen, and errors in the choice or use of a protocol have led to fallacious reports.<ref>Pignotti, M., & Mercer, J. (2007). Holding Therapy and Dyadic Developmental Psychotherapy are not supported and acceptable social work interventions. Research on Social Work Practice, 17(4), 513-519.</ref> The meaningfulness of an SRS report on an intervention is limited by the quality of the research under consideration, but SRS reports can be helpful to readers seeking to understand EBP-related choices.

Miller et al. provide a detailed example of the use of meta-analysis in examining treatment outcome research, incorporating the principles of rigorous empirical research from the strong end of the continuum of levels of evidence.<ref>Miller, W. R., Wilbourne, P. L., & Hettema, J. E. (2003). "What works? A summary of alcohol treatment outcome research". Chapter 2 in Hester, R., & Miller, W. R. (Eds.), Handbook of alcoholism treatment approaches: Effective alternatives (3rd ed., pp. 13-63). Allyn & Bacon. Summary table: http://www.behaviortherapy.com/whatworks.htm</ref> This handbook explains how the included research was selected (e.g., a controlled study comparing two different approaches, published in a peer-reviewed journal, with sufficient power to detect significant differences if they occurred) and how each study was checked for validity (how was the outcome measured?) and reliability (did the researchers do what they said they did?). The result is a Cumulative Evidence Score weighted by the quality of each study (and not by its outcome), so that better studies with stronger designs and better methodological quality ratings carry more weight than weaker studies. The results lead to a rank ordering of the 48 treatment modalities included and provide a basis for selecting supportable treatment approaches beyond anecdote, tradition and lore.
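The pooling arithmetic at the heart of such a meta-analysis can be sketched in a few lines. The example below is a generic fixed-effect (inverse-variance) pooling with invented effect sizes and standard errors; it is not a reproduction of Miller et al.'s quality-weighted scoring scheme, only an illustration of how weighting lets stronger studies count for more.

```python
# Hypothetical sketch: a generic fixed-effect (inverse-variance)
# meta-analysis. Each study contributes an effect size and a standard
# error; weights are 1/se^2, so larger, more precise studies count more.
# The numbers below are made up for illustration.
import math

def fixed_effect_meta(studies):
    """studies: list of (effect_size, standard_error) pairs."""
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval for the pooled effect
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Three small hypothetical trials of the same intervention
pooled, (low, high) = fixed_effect_meta([(0.30, 0.15), (0.45, 0.20), (0.25, 0.10)])
print(f"pooled effect = {pooled:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```

A quality-weighted scheme like Miller et al.'s differs in that the weights reflect methodological ratings rather than statistical precision alone, but the principle is the same: the conclusion is dominated by the studies judged strongest.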
 
===Social policy===
There are increasing demands for the whole range of social policy and other decisions and programs run by government and the NGO sector to be based on sound evidence as to their effectiveness. This has led to an increased emphasis on the use of a wide range of [[evaluation approaches]] directed at obtaining evidence about social programs of all types. A research collaboration called the [http://www.campbellcollaboration.org Campbell Collaboration] has been set up in the social policy area to support evidence-based decision-making. This collaboration follows the approach pioneered by the Cochrane Collaboration in the health sciences.<ref>http://www.cochrane.org</ref> An evidence-based approach to social policy has a number of advantages because it can decrease the tendency to run programs which are socially acceptable (e.g., drug education in schools) but which often prove ineffective when evaluated.<ref name="raines">{{cite book|author=Raines, J. C.|year=2008|title=Evidence-based practice in school mental health|___location=New York|publisher=Oxford University Press}}</ref>
 
==See also==
{{col div|colwidth=30em}}
* [[Dynamic treatment regimes]]
* [[Epidemiology]]
* [[Evidence-based assessment]]
* [[Evidence-based conservation]]
* [[Evidence-based dentistry]]
* [[Evidence-based design]]
* [[Evidence-based education]]
* [[Evidence-based legislation]]
* [[Evidence-based library and information practice]]
* [[Evidence-based management]]
* [[Evidence-based medical ethics]]
* [[Evidence-based medicine]]
* [[Evidence-based nursing]]
* [[Evidence-based pharmacy in developing countries]]
* [[Evidence-based philanthropy]]—effective altruism
* [[Evidence-based policing]]
* [[Evidence-based policy]]
* [[Evidence-based research]]—metascience
* [[Evidence-based scheduling]]
* [[Evidence-based toxicology]]
* [[Impact evaluation]]
{{colend}}
 
== References ==
{{Reflist}}
* {{cite journal |author=Dale AE |title=Evidence-based practice: compatibility with nursing |journal=Nurs Stand |volume=19 |issue=40 |pages=48–53 |year=2005 |pmid=15977490 |doi=}}
* DiCenso, A., Cullum, N., & Ciliska, D. (1998). Implementing evidence-based nursing: some misconceptions. ''Evidence-Based Nursing, 1'', 38-40.
* French, P. (2002). What is the evidence on evidence-based nursing? An epistemological concern. ''Journal of Advanced Nursing, 37''(3), 250-257.
* Mason, D. J., Leavitt, J. K., & Chaffee, M. W. (2002). ''Policy and politics in nursing and health care'' (4th ed.). St Louis, MO: Saunders/Elsevier.
* Melnyk, B. M., & Fineout-Overholt, E. (2005). ''Making the case for evidence-based practice''. Philadelphia: Lippincott Williams & Wilkins.
* Mitchell, G. (1999). Evidence-based practice: critique and alternative view. ''Nursing Science Quarterly, 12''(1), 30-35.
 
==External links==
{{commons}}
* {{cite web|url=https://www.ama-assn.org/residents-students/residency/development-evidence-based-medicine-explored-oral-history-video |title= Development of evidence-based medicine explored in oral history video, AMA, JAN 27, 2014|date= 27 January 2014}}
 
* [http://www.keele.ac.uk/research/pchs/pcmrc/EBP/index.htm Evidence-Based Practice @ Keele]
* [http://www.shef.ac.uk/scharr/ir/def.html Evidence Based Practice Definitions]
* [http://www.joannabriggs.edu.au/about/home.php The Joanna Briggs Institute] - International Collaborative on Evidence-based Practice in Nursing
* [http://www.ebnp.org Indiana Center for Evidence Based Nursing Practice: A JBI Collaborating Center]
* [http://www.evidence.no/en Evidence] - communicates leading research in order to promote international cooperation and evidence-based treatments
* [http://www.rhrq.gov/clinic/epc Evidence-based practice clinic initiatives]
* [http://library.umassmed.edu/ebpph Evidence-based practice studies at the University of Massachusetts]
* [http://nursing.asu.edu/caep/index.htm Center for the Advancement of Evidence-Based Practice (CAEP)] at Arizona State University College of Nursing and Healthcare Innovation
* [http://nnpnetwork.org The National Nursing Practice Network]

{{Evidence-based practice}}
 
{{DEFAULTSORT:Evidence-Based Practice}}
[[Category:Health care quality]]
[[Category:Scientific method]]
[[Category:Evidence-based practices]]
 