EBP
1 / 14
This page offers a starting point for finding information on Evidence Based Practice [EBP].
There are many definitions of EBP with differing emphases. A survey of social work faculty even
showed they have different ideas about just what makes up EBP. This can be a source of confusion
for students and newcomers to this topic of study. Perhaps the best known is Sackett et al.'s (1996,
71-72) now dated definition from evidence based medicine: "Evidence based medicine is the
conscientious, explicit, and judicious use of current best evidence in making decisions about the care
of individual patients. The practice of evidence based medicine means integrating individual clinical
expertise with the best available external clinical evidence from systematic research. By individual
clinical expertise we mean the proficiency and judgment that individual clinicians acquire through
clinical experience and clinical practice. Increased expertise is reflected in many ways, but especially
in more effective and efficient diagnosis and in the more thoughtful identification and compassionate
use of individual patients' predicaments, rights, and preferences in making clinical decisions about
their care. By best available external clinical evidence we mean clinically relevant research, often
from the basic sciences of medicine, but especially from patient centered clinical research into the
accuracy and precision of diagnostic tests (including the clinical examination), the power of
prognostic markers, and the efficacy and safety of therapeutic, rehabilitative, and preventive
regimens."
This early definition, however, proved to have some important limitations in practice. Haynes et al
(2002) - Sackett's colleagues in the McMaster Group of physicians in Canada - pointed out the
definition did not pay enough attention to the traditional determinants of clinical decisions. That is, it
purposefully emphasized research knowledge but did not equally emphasize the client's needs and
situation, nor the client's stated wishes and goals, nor the clinicians' expertise in assessing and
integrating all these elements into a plan of intervention.
The contemporary definition of EBP is simply "the integration of the best research evidence with
clinical expertise and patient values" (Sackett et al., 2000, p. x). This simpler, current definition
gives equal emphasis to 1) the patient's situation, 2) the patient's goals, values and wishes, 3) the best
available research evidence, and 4) the clinical expertise of the practitioner. The difference is that a
patient may refuse interventions with strong research support due to differences in beliefs and values.
Similarly, the clinician may be aware of factors in the situation (co-occurring disorders, lack of
resources, lack of funding, etc.) that indicate interventions with the best research support may not be
practical to offer. The clinician may also notice that the best research was done on a population
different from the current client, making its relevance questionable, even though its rigor is strong.
Such differences may include age, medical conditions, gender, race or culture and many others.
This contemporary definition of EBP has been endorsed by many social workers. Gibbs and Gambrill
(2002), Mullen and Shlonsky (2004), Rubin (2008), and Drisko and Grady (2012) all apply it in their
publications. Social workers often add emphasis to client values and views as a key part of
intervention planning. Many social workers also argue that clients should be active participants in
intervention planning, not merely recipients of a summary of "what works" from an "expert" (Drisko
& Grady, 2012). Actively involving clients in intervention planning may also be a useful way to
enhance client motivation and to empower clients.
Some in social work view EBP as a mix of a) learning what treatments "work" based on the best
available research (whether experimental or not), b) discussing client views about the treatment to
consider cultural and other differences, and to honor client self-determination and autonomy, c)
considering the professional's "clinical wisdom" based on work with similar and dissimilar cases that
may provide a context for understanding the research evidence, and d) considering what the
professional can, and cannot, provide fully and ethically (Gambrill, 2003; Gilgun, 2005). With much
similarity but some differences, the American Psychological Association (2006, p. 273) defines EBP
as "the integration of the best available research with clinical expertise in the context of patient
characteristics, culture and preferences." Gilgun (2005) notes that while research is widely discussed,
the meanings of "clinical expertise" and "client values and preferences" have not been widely
discussed and have no common definition.
Drisko & Grady (2012) argue that the EBP practice decision making process defined by Sackett and
colleagues seems to fit poorly with the way health care payers enact EBP at a macro, policy level.
Clinical social workers point to lists of approved treatments that will be funded for specific disorders -
and note this application of EBP does not include specific client values and preferences and ignores
situational clinical expertise. Drisko & Grady point out that there is a conflict between the EBP model
and how it is implemented administratively to save costs in health care. While cost savings are very
important, this use of "EBP" is not consistent with the Sackett model. Further, the criteria used to
develop lists of approved treatments are generally not clear or transparent - or even stated. Payers very
often appear to apply standards that are different from multidisciplinary sources of systematic reviews
of research like the Cochrane Collaboration. Clinical expertise and client values too often drop out of
the administrative applications of EBP.
Evidence based practice is one useful approach to improving the impact of practice in medicine,
psychology, social work, nursing and allied fields. Of course, professions have directed considerable
attention to "evidence" for many years (if not for as long as they have existed!). They have also
honored many different kinds of evidence. EBP advocates put particular emphasis on the results of
large-scale experimental comparisons to document the efficacy of treatments against untreated control
groups, against other treatments, or both. (See, for example, the University of Oxford's Hierarchy of
Evidence for EBM). They do this because well conceptualized and completed experiments (also
called RCTs) are a great way to show a treatment caused a specific change. The ability to make cause
and effect determinations is the great strength of experiments. Note that this frames "evidence" in a
very specific and delimited manner. Scholars in social work and other professions have argued for
"Many Ways of Knowing" (Hartman, 1990). They seek to honor the knowledge developed by many
different kinds of research - and to remind clinicians, researchers and the public that the
conceptualization underlying research may be too narrow and limited. Thus Drisko & Grady (2012)
argue that EBP, as summarized by researchers, may devalue non-experimental research. Experiments
are only as good as the exploratory research that discovers new concepts, and the descriptive research
that helps in the development of tests and measures. Only emphasizing experiments ignores the very
premises on which they rest. Finally, note that EBM/EBP hierarchies of research evidence include
many non-experimental forms of research since experiments for some populations may be unethical
or impractical - or simply don't address the kinds of knowledge needed in practice.
All the "underpinnings" of experimental research: the quality of conceptualizations, the quality of
measures, the clarity and specificity of treatments used, the quality of samples studied, and the
quality and completeness of collected data are assumed to be sound and fully adequate when used to
determine "what works." There is also an assumption that the questions framing the research allow
for critical perspectives and are fully ethical. Social workers would argue such studies should also
sample social diversity well - since diverse kinds of people show up at real-world clinics.
International standards affirm basic ethical principles supporting respect for persons, beneficence and
social justice (see The Belmont Report.)
Is EBP only about Intervention or Treatment Planning?
No. This may be the most common application of EBP for clinical social workers, but the EBP
process can also be applied to a) making choices about diagnostic tests and protocols to ensure
thorough and accurate diagnosis, b) selecting preventive or harm-reduction interventions or
programs, c) determining the etiology of a disorder or illness, d) determining the course or
progression of a disorder or illness, e) determining the prevalence of symptoms as part of establishing
or refining diagnostic criteria, f) completing economic decision-making about medical and social
service programs (University of Oxford Centre for Evidence-based Medicine, 2011), and even g)
understanding how a client experiences a problem or disorder (Rubin, 2008).
Note that there are a growing number of commercial [.com] sites that offer their consultation
regarding EBP. It is not always easy to determine their organizational structure and purposes, the basis
of their recommendations and any potential conflicts of interest. In this regard, the sites of the
government and of professional organizations are "better" resources as their purposes, missions and
funding sources are generally more clear and publicly stated.
References:
American Psychological Association. (2006). APA Presidential Task Force on evidence-based practice. Washington, DC: Author.
Dobson, K., & Craig, K. (1998). Empirically supported therapies: Best practice in professional psychology. Thousand Oaks, CA: Sage.
Drisko, J. & Grady, M. (2012). Evidence-based practice in clinical social work. New York: Springer-Verlag.
Elwood, J.M. (2007). Critical appraisal of epidemiological studies and clinical trials (3rd ed.). New York: Oxford University Press.
Gambrill, E. (2003). Evidence-based practice: Implications for knowledge development and use in social work. In A. Rosen & E. Proctor (Eds.), Developing practice guidelines for social work intervention (pp. 37-58). New York: Columbia University Press.
Gibbs, L. (2003). Evidence-based practice for the helping professions. New York: Wadsworth.
Gilgun, J. (2005). The four cornerstones of qualitative research. Qualitative Health Research, 16(3), 436-443.
Howard, M., McMillen, C., & Pollio, D. (2003). Teaching evidence-based practice: Toward a new paradigm for social work education.
Research on Social Work Practice, 13, 234-259.
Mace, C., Moorey, S., & Roberts, B. (Eds.). (2001). Evidence in the psychological therapies: A critical guide for practitioners.
Philadelphia, PA: Taylor & Francis.
Mantzoukas, S. (2008). A review of evidence-based practice, nursing research and reflection: Levelling the hierarchy. Journal of Clinical
Nursing, 17(2), 214-223.
Roberts, A., & Yeager, K. (Eds.). (2004). Evidence-based practice manual: Research and outcome measures in health and human services.
New York: Oxford University Press.
Sackett, D., Rosenberg, W., Muir Gray, J., Haynes, R., & Richardson, W. (1996). Evidence-based medicine: What it is and what it isn't. British
Medical Journal, 312, 71-72. http://cebm.jr2.ox.ac.uk/ebmisisnt.html
Sackett, D., Richardson, W., Rosenberg, W., & Haynes, R. (1997). Evidence-based medicine: How to practice and teach EBM. New
York: Churchill Livingstone.
Simpson, G., Segall, A., & Williams, J. (2007). Social work education and clinical learning: Reply to Goldstein and Thyer. Clinical Social
Work Journal, (35), 33-36.
Smith, S., Daunic, A., & Taylor, G. (2007). Treatment fidelity in applied educational research: Expanding the adoption and application of
measures to ensure evidence-based practice. Education & Treatment of Children, 30(4), pp. 121-134.
Stout, C., & Hayes, R. (Eds.). (2005). The evidence-based practice: Methods, models, and tools for mental health professionals. Hoboken,
NJ: Wiley.
Stuart, R., & Lilienfeld, S. (2007). The evidence missing from evidence-based practice. American Psychologist, 62(6), pp. 615-616.
Trinder, L., & Reynolds, S. (2000). Evidence-based practice: A critical appraisal. New York: Blackwell.
Wampold, B. (2007). Psychotherapy: The humanistic (and effective) treatment. American Psychologist, 62(8), pp. 857-873.
What Is EBP?
Evidence based practice (EBP) might best be viewed as an ideology or a "public idea" in Robert
Reich's (1990) terminology. The movement began with the work of Scottish physician Archie
Cochrane who sought to identify "treatments that work" using the results of experimental research.
Another important aspect of Cochrane's interest was to identify and end treatments that do harm or are
not effective. In practice, the idea was to supplement professional decision making with the latest
research knowledge. [It is worth noting that some critics might argue "replace" professional decision
making would be the more accurate word.] The goal was to enhance the scientific base of professional
practice in several disciplines - medicine, nursing, psychology, social work, etc. In turn, educational
efforts in these disciplines could be oriented to provide beginning professionals with effective tools
and a model for the continuing improvement and renewal of their professional practices.
EBP as described here is not the same as an empirically supported treatment [EST]. ESTs are
variously defined, but are basically treatments or services that have been empirically studied and
found to be helpful or effective - in one or more settings. The difference is that ESTs may be
treatments or services that are not fully replicable in other settings and most have not been studied in
multiple contexts. ESTs may not have been replicated - tested more than once - to ensure the result
will be the same or similar in another setting. The efforts of the Cochrane Collaboration include
setting rigorous standards for examining the methods by which outcome research is done as well as
reporting research results. This is to ensure transparency in their findings [that others can know
exactly how the conclusions were drawn] and not to exaggerate the results of a single test of any
treatment.
Why EBP? Some Say Any Other Approach Is Unethical
Some advocates argue that to treat anyone using treatments without known efficacy is unethical. That
is, if we know a given medicine or substance abuse program or treatment for attachment problems
works better than another treatment, it is an ethical obligation to use it in order to best serve clients or
patients. This is an argument that is hard to challenge - at least in an ideal world. Given strong and
unambiguous research evidence that is clearly useful to a given practice situation, and consistent with
the client's world view and values, using EBP "best treatment" is the best way to go.
Policy and Funding Issues
In social work and psychology, advocates have also argued that only interventions with demonstrated
efficacy should be supported financially. Such an argument links demonstrations of efficacy with the
funding structure of the current managed care environment. It may be seen as either a way to best use
limited dollars or yet another method to curtail funding for costly services. Without provision of
adequate funds to do thorough research on the great variety of treatments in use, the requirement of
proven efficacy may be used as a tool to limit treatment services.
Assessing EBP -- Some Key Issues
In psychology, the initial unveiling of "empirically validated treatments" by an American
Psychological Association Task Force brought forth both interest and criticism. It also brought out
differences regarding interpretations of the existing research literature and regarding the merits of
certain research methods. One key concern was the over-reliance on randomized control trials
[RCTs]. An RCT is an experiment in which participants are randomly assigned to either a treatment
or a control group. Ideally, neither participant nor treating clinician knows which group is which. After
a course of treatment (or control), improvement is determined by comparing pre-treatment status with
post-treatment status. If the treated group improves significantly more than the controls, we can say
the treatment caused the change and that the treatment works (better than no treatment). In another
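The comparison logic behind an RCT can be sketched with a small simulation. This is a purely illustrative sketch: the sample size, effect size, and noise levels below are invented numbers, not values from any real trial.

```python
import random
import statistics

def run_rct(n_per_arm=200, treatment_effect=5.0, seed=42):
    """Simulate a two-arm RCT: randomly assign participants, then
    compare mean improvement between the treated and control arms.
    All parameters are illustrative assumptions."""
    rng = random.Random(seed)
    participants = list(range(2 * n_per_arm))
    rng.shuffle(participants)                 # random assignment
    treated = set(participants[:n_per_arm])

    improvements = {"treatment": [], "control": []}
    for pid in participants:
        # Everyone changes somewhat on their own (natural course,
        # regression to the mean); the treated arm also gets the
        # true treatment effect added on top.
        change = rng.gauss(2.0, 4.0)
        if pid in treated:
            change += treatment_effect
            improvements["treatment"].append(change)
        else:
            improvements["control"].append(change)

    # The between-arm difference in mean improvement estimates
    # the treatment effect.
    return (statistics.mean(improvements["treatment"])
            - statistics.mean(improvements["control"]))

print(f"Extra improvement in treated arm: {run_rct():.2f}")
```

Because assignment is random, the treatment itself is the only systematic difference between the two arms, which is why the observed difference in improvement can be read as a causal effect of treatment rather than as a difference between the kinds of people in each group.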