Research Use in Education: An Online Survey of School
Practitioners
Larysa V. Lysenko, Concordia University
Philip C. Abrami, Concordia University
Robert M. Bernard, Concordia University
Christian Dagenais, Université de Montréal
Abstract
This paper summarizes the results of a pan-Canadian online survey study that investigates the
extent to which school practitioners (N = 1,153) use research to inform their practice. The
self-reports indicate that the majority of the respondents used educational research, yet this
engagement was infrequent. Although the respondents expressed neutral attitudes toward research,
their comments lend a rather negative connotation to their perceptions. This study’s findings are
relevant to school leadership organizations, teacher education institutions, and
research-generating bodies as they point to the necessity of increasing research relevance and
accessibility, cultivating teaching as a research-based profession, and building school capacity
to use research.
Keywords: online survey, self-reported attitudes and behaviours, use of research-based
information, factors, school practice
Larysa Lysenko, Ph.D. is a professional researcher at the Centre for the Study of Learning and Performance,
Department of Education, Concordia University, 1211 St. Mathieu, Montreal, Quebec, Canada, H3H 2S2
E-mail: [email protected]
Philip C. Abrami, Ph.D. is a Professor, University Research Chair, and the Director of the Centre for the Study of
Learning and Performance, Department of Education, Concordia University, 1211 St. Mathieu, Montreal, Quebec,
Canada, H3H 2S2
E-mail: [email protected]
Robert M. Bernard, Ph.D. is Professor of Education and Leader of the Systematic Review Sub-Theme at the Centre
for the Study of Learning and Performance, Concordia University, S-FG 6207 Faubourg Ste-Catherine Building,
1610 St. Catherine W., Montreal, Quebec, Canada, H3H 2S2
E-mail: [email protected]
Christian Dagenais, Ph.D. is Associate Professor of Psychology at the University of Montreal, Pavillon
Marie-Victorin, 90 Vincent d'Indy avenue, Montreal, Quebec, Canada, H3C 3J7
E-mail: [email protected]
Lysenko, Abrami, Bernard, & Dagenais Research Use in Education
Brock Education Journal, 25 (1), Fall 2015
Introduction
In the past decades, research on educational issues and practices has generated a rich knowledge
base that could be used to improve classroom teaching. However, there has been a persistent lack
of a systematic application of research knowledge to teaching practice (e.g., Organisation for
Economic Co-operation and Development [OECD], 2007, 2010). While early studies focused
on the use of educational research by practitioners, there has been a recent and noticeable shift
towards examining research use as a system property, meaning a coordinated series of actions by
organizations to produce, transform, and implement research knowledge to improve both
teaching practice and learning outcomes. To emphasize the systemic nature of research
knowledge utilization and the multiplicity of agents involved, Sharples (2013) introduced the
term “knowledge mobilization ecosystem,” in which the process of transforming research
evidence into actionable knowledge for use is the critical component of the system.
However, this focus on creating and supporting an effective knowledge mobilization
system has overshadowed efforts to examine and improve knowledge use by individual
practitioners. For instance, Levin (2013) argues: “It is unreasonable and even undesirable to
expect individual teachers or principals to be the key consumers of research findings” (p. 18).
Despite the importance accorded to research use as an organizational process, it should not be
forgotten that the individual teacher remains an agent of change whose actions make the
difference in the classroom.
Conceptual Framework
The literature suggests that research use is an interpretative process and is affected by more than
research information alone. Walter, Nutley, and Davies (2005) suggest that individual
characteristics including perceptions, experiences, skills, and social contexts such as
organizational settings and broader environments also shape the ways research knowledge
trickles down to individual practice. Coburn, Honig, and Stein (2009) specify that “individuals
and groups make meaning of evidence in ways that are profoundly shaped by their pre-existing
beliefs and practices and day-to-day limits” (p. 86).
Elsewhere Dagenais et al. (2012) have reviewed the empirical evidence regarding
potential factors affecting practitioners’ use of research evidence. They view research use as a
multidimensional construct and draw on the distinction that the literature makes between
instrumental, conceptual, and symbolic uses. Instrumental use is concerned with the change to
concrete practices where research findings are transmitted and applied intact (e.g., Knott &
Wildavsky, 1980). Conceptual (Neilson, 2001) or “enlightenment” use (Weiss, 1980) represents
a change in understanding or thinking about an issue that affects practice indirectly. Symbolic or
strategic use (e.g., Hughes, McNeish, Newman, Roberts, & Sachdev, 2000) relates to using
research findings to influence decisions, justify actions, or support a decision that has already
been made. Estabrooks (1999) calls this persuasive use, emphasizing that such use can serve
the constructive purpose of legitimizing a position or practice. Although research use may occur
for one specific purpose or simultaneously for different ends, conceptual use is argued to prevail
over and precede the other two (e.g., Sunesson & Nilsson, 1988).
Relying on Rogers’s (2003) innovation diffusion framework, Dagenais et al. (2012)
group the potential factors influencing practitioner use of research knowledge. At the individual
level, they identify (1) qualities of research findings as perceived by the practitioner (opinions)
and (2) the practitioner’s capacity to use this information for a variety of ends (expertise). At the
school level, the factors include (3) the school context and culture (organizational factors) that
impact practitioners’ openness to learning and engagement with research generated in academia
or locally in school-based projects. Finally, the authors identify (4) communication and
interaction between the levels of the system and its actors (awareness activities) as a factor
affecting visibility and consequently practical utility of research findings. The researchers argue
that the power of these four factors to determine and predict practitioners’ engagement with
research must be taken into account in future studies. In fact, a recent empirical study of research
use found that school practitioners’ attitudes toward research, and their capacity to engage with
it, were the strongest predictors of their low and infrequent self-reported use of research
(Lysenko, Abrami, Bernard, Dagenais, & Janosz, 2014). This paper, which takes educational
practitioners as its focus, offers an update on their propensity to inform classroom practice
with findings from educational research, drawing on school practitioners’ self-reported attitudes
and behaviours gathered through an online, pan-Canadian survey.
Method
This study is an extension of an earlier project (Lysenko et al., 2014), which was conducted in
the narrower context of secondary schools located in disadvantaged areas in a province of
Eastern Canada. For this study, the context was broadened to include respondents across Canada
teaching in both elementary and secondary schools. The data collection also included responses
to open-ended survey questions that were added to the original instrument. While the
primary purpose of the study was the validation of the instrument in a broader context, this paper
reports the substantive findings with a focus on the qualitative data provided by practitioners.
Instrument
Data collection was done through the “Questionnaire about the Use of Research-based
Information” (QURBI), (Dagenais, Janosz, Abrami, Bernard, & Lysenko, 2008; Lysenko et al.,
2014), originally developed to assess factors that affect the extent of research use by educational
practitioners. Figure 1 represents the iterative development and refinement of this instrument
including pilot testing and large-scale validation.
Figure 1. Stages of QURBI development: literature review, focus groups, interviews, and team
discussions; analysis of existing instruments and consultation with a government educational
official; pilot testing (n = 105); empirical refinement; and large-scale validation (n = 2,425).
The questionnaire was validated with 2,425 secondary school teachers. It demonstrated
high internal consistency of 0.92. The obtained four-factor structure reflected the conceptual
framework and accounted for 17% of the variance in self-reported research use, as well as 22%
of the variance in the three dimensions of use.
The instrument consists of 43 close-ended items. The first set of outcome variables asked
about the use of various sources of research-based information (RBI). RBI was defined as a
free-circulating commodity produced by professional researchers, such as university research
teams and external evaluation teams, or produced locally by practitioners conducting research
in collaboration with researchers and intended for local use. Specifically, practitioners were
asked to report on how frequently in the past year they used research-based information from
different sources, such as scholarly documents, professional publications, and websites. The
second set of outcome variables examined the dimensions of the use of research-based
information in the past year. Instrumental use was captured through items such as “to improve
professional practice” and “to develop new activities, programs.” Conceptual use was measured
by the following items: “to achieve a better understanding of practical issues” and “to reflect on
one’s attitudes and practices.” Symbolic use was examined through items related to confirming
practices or actions.
The factor variables (26 items) were provisionally organized into four sections:
1. Opinions about research — focused on the relevance, timeliness, and reliability of
research-based information, as well as its usefulness, ease of access and
understanding, and transferability to practice.
2. Individual expertise — examined the importance practitioners attached to the
skills needed to access, appraise, and translate RBI to their practice.
3. Awareness activities — referred to practitioners’ involvement in research, contact
with researchers and knowledge brokers, as well as the way research findings are
presented to practitioners.
4. Organizational factors — focused on school culture (supportive environment, the
importance of professional development, opportunities to challenge habits and
traditions), available resources (qualified staff, facilities and technology, time,
incentives), and external influences on individual practices (organized groups,
such as unions and granting agencies).
Each section of the questionnaire was followed by an open-ended question to elicit practitioners’
comments. Additionally, the participants were encouraged to provide demographic data
including gender, teaching experience, education level, occupation category, and grade level.
School information including geographical location, size, and language of teaching was also
gathered. Finally, the survey elicited reports of previous engagement in research such as
coursework in research methods or participation in research projects.
Survey Distribution
The English and French versions of the survey were hosted and distributed through the Canadian
Council on Learning’s online assessment platform. Provincial and federal teachers’ unions were
contacted to solicit practitioners to respond to the survey. Non-profit educational organizations
and provincial teacher colleges were also asked for assistance. While the recruiting strategies
were left to the discretion of these organizations, the majority contacted members electronically
—for example, through electronic newsletters, e-mails, and websites.
Analyses
Standard SPSS procedures were applied to screen data from the 1,611 surveys registered in the
online assessment system. After deleting the cases where 60% or more responses were missing,
the remaining surveys had missing data imputed by expectation maximization. No univariate
outliers were detected. All multivariate outliers were removed, leaving 1,153 cases for analyses
that were completed using SPSS for Windows. To compensate for the unbalanced sample, the
mean scores were weighted. Answers to open-ended questions were analysed using NVivo 8.
Categories of notions with similar meaning and connotations were developed, relying on both a
priori and a posteriori approaches (Leech & Onwuegbuzie, 2011). A chi-square test of
independence was performed to test for an association between the categories emerging
from practitioners’ comments and the frequency with which they used research-based
information.
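The screening steps described above can be illustrated with a short Python sketch. This is not the original SPSS syntax: the data are toy values, and simple column-mean imputation stands in for the expectation-maximization imputation the study used.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def screen_surveys(df, max_missing=0.60):
    """Drop cases with 60% or more missing answers, then impute the
    remaining gaps. Column means stand in here for the expectation-
    maximization imputation performed in SPSS in the original study."""
    keep = df.isna().mean(axis=1) < max_missing
    screened = df.loc[keep]
    return screened.fillna(screened.mean())

# Toy item responses (0-3 scale); the third case is 80% missing.
raw = pd.DataFrame([
    [1.0, 2.0, np.nan, 3.0, 2.0],
    [2.0, 2.0, 1.0, 3.0, 1.0],
    [np.nan, np.nan, np.nan, np.nan, 2.0],
    [3.0, 1.0, 2.0, np.nan, 3.0],
])
clean = screen_surveys(raw)  # 3 of 4 cases retained, no missing values left

# Chi-square test of independence between comment category and
# frequency of reported use (hypothetical 2x2 contingency counts).
counts = [[30, 10], [25, 15]]
chi2, p, dof, expected = chi2_contingency(counts)
```

The same logic applies at the study's scale: the 60% threshold reduces the 1,611 registered surveys before imputation and outlier screening.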
Results
Respondents
Teachers were the largest category of respondents (82.1%) whereas school administrators and
professional staff represented 8.7% each. Teachers had on average 12 years of teaching
experience. Most of the teachers (74%) and professionals (60%) held an undergraduate degree
while 54.4% of the administrators had a master’s degree. Only 1.3% of respondents had a
pre-university education, an undergraduate certificate, or a doctorate degree. In terms of training
and involvement in research, 56.3% of respondents reported having taken some coursework in
research methods and 57.8% of them indicated they had participated in a research project of
some kind. Only 29.3% reported having taken part in school-based research projects.
Additionally, the majority of respondents (96.3%) reported being part of the public school
system with 33% working in secondary and 77% in primary schools. Only 3.9% reported French
as their working language. Forty-six percent of respondents worked in medium size schools
(150-500 students). Almost 90% of them worked in schools located in suburban and rural areas.
As Table 1 shows, the respondents were distributed across the provinces and territories,
with Ontario practitioners accounting for roughly 50% of respondents.
Table 1
Distribution of Respondents by Province and Category of Employment
Province/Territory          Teachers  Administrators  Professionals  Unknown  Total
Alberta                           35              17              4        -     56
British Columbia                 131               1             10        -    142
Manitoba                          12               6              1        -     19
New Brunswick                    142              19             13        -    174
Newfoundland and Labrador          4               8              -        -     12
Northwest Territories              -               -              1        -      1
Nova Scotia                       16               2              -        -     18
Nunavut                            -               2              -        -      2
Ontario                          521              31             51        -    603
Prince Edward Island               5               0              2        -      7
Quebec                            43               5             17        -     65
Saskatchewan                       7               5              1        -     13
Yukon                              1               1              -        -      2
Unknown                           30               4              1        4     39
Total                            947             101            101        4  1,153
QURBI Close-ended Questions
The psychometric statistics obtained in this study corroborated the earlier findings of the survey
(Lysenko et al., 2014). Internal consistency reliability coefficients (Cronbach’s alpha) were
acceptable, ranging from 0.77 to 0.92 for the sub-scales and reaching 0.94 for the overall
questionnaire. Reflecting the conceptual structure provisionally imposed on the questionnaire,
the four-factor solution accounted for 61% of the variance, with individual factors explaining
between 4.6% and 39%.
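The internal-consistency coefficients reported above follow the standard Cronbach's alpha formula, which can be computed from raw item scores as in the minimal sketch below (toy data, not the study's dataset).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # per-item variances, summed
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Perfectly consistent toy responses yield alpha = 1.0.
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Values near 0.9, as reported for the QURBI sub-scales, indicate that the items within a sub-scale vary together rather than independently.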
Survey responses show that each source of research-based information (RBI) was used at
least once in the past year (Figure 2). The majority of respondents (85%) obtained
research-based information from resource personnel, whereas only 58% referred to pre-service
training to inform their practice.
Figure 2. Percent of respondents who used RBI at least once in the past year.
The mean scores summarized in Table 2 reveal that respondents were not particularly avid users
of sources that would potentially contain research evidence. Respondents reported having used
research-based information on average “once or twice during the past year.” However, some
variation is noticeable across the ten sources. Respondents reported using online sources
(e.g., websites) most frequently, whereas pre-service training was rated as the least used.
Interestingly, school evaluations were rated third lowest.
Table 2
QURBI Weighted Means and Standard Deviations
1. Rate the frequency with which you have used the RBI from the following
sources during the last year (0 = never; 3 = five or more times)       Mean (SD)
Scholarly documents                                                    1.58 (1.08)
Professional publications                                              1.53 (1.02)
School evaluations                                                     1.12 (0.99)
Internet, websites                                                     1.61 (1.13)
Multimedia: video, DVD                                                 1.13 (1.02)
Mass media: TV, radio, newspapers                                      1.09 (1.04)
Pre-service training                                                   0.95 (1.01)
In-service training, workshops                                         1.41 (0.99)
Professional conferences, presentations                                1.26 (0.91)
Experts, resource people                                               1.43 (0.95)
Use of RBI composite                                                   1.31 (0.67)
2. Rate the frequency with which you have used RBI during the last year
(0 = never; 3 = always)
To achieve a better understanding of issues in your practice           1.53 (0.77)
To satisfy intellectual curiosity                                      1.60 (0.75)
To improve professional practice                                       1.66 (0.75)
To reflect on your attitudes and practices                             1.47 (0.76)
To justify or validate your decisions                                  1.39 (0.77)
To resolve problems in your daily practice                             1.22 (0.73)
To develop new activities, programs, guidelines                        1.57 (0.74)
Conceptual use composite                                               1.53 (0.67)
Instrumental use composite                                             1.48 (0.63)
Symbolic use                                                           1.39 (0.77)
3. Rate the extent to which you agree on RBI
(1 = strongly disagree; 3 = neutral; 5 = strongly agree)
Is easy to find                                                        3.12 (0.93)
Is easy to understand                                                  3.05 (0.87)
Is relevant to your reality                                            3.39 (0.98)
Offers timely information                                              3.15 (0.91)
Is reliable and trustworthy                                            3.29 (0.82)
Is useful to guide or improve your professional practice               3.49 (0.96)
Is easy to transfer into your practice                                 3.08 (0.95)
4. Rate the extent to which you agree that the following activities are useful
to make you aware of RBI (1 = strongly disagree; 3 = neutral; 5 = strongly agree)
Presentation of research findings tailored to your needs               3.43 (0.94)
Your involvement in a research project                                 3.54 (0.95)
Research results accompanied by clear and explicit recommendations     3.82 (0.87)
Opportunities to discuss research results with the research team       3.83 (0.84)
Regular contacts with people who distribute research-based information 3.53 (0.93)
Demonstrations about how to apply research recommendations             3.78 (0.91)
Discussions of research-based information with colleagues              3.56 (0.91)
5. Rate the extent to which you agree that the following skills are necessary
in everyday practice (1 = strongly disagree; 3 = neutral; 5 = strongly agree)
Ability to read and understand research publications                   3.91 (0.86)
Skills to use IT, such as Internet and databases                       4.23 (0.79)
Ability to assess the quality of research-based information            3.82 (0.89)
Expertise to translate research findings to practice                   3.80 (0.92)
6. Rate the extent to which you agree that the following organizational factors
influence use of RBI (1 = strongly disagree; 3 = neutral; 5 = strongly agree)
Available time to read a journal, apply a new technique, etc.          4.20 (0.83)
Available facilities and technology                                    3.90 (0.89)
Incentives, such as remuneration, honoraria, and lessening the workload 3.31 (1.13)
Opportunities to challenge established habits and traditions           3.66 (0.89)
Organizational importance for professional development                 3.71 (0.92)
A supportive environment                                               3.91 (0.86)
Human resources, such as the availability of qualified staff           3.84 (0.92)
Organized groups, such as unions, granting agencies, media             2.98 (1.02)
With respect to the ends to which practitioners reported having used RBI, there is a split in
their reports on the instrumental use of RBI. Improving professional practice was the most
frequently reported purpose whereas resolving issues in everyday practice was the least. However,
the dispersion of scores on this scale was fairly small. Although mean scores of the observed
factor variables gravitate around the point of neutrality, between-factor and within-factor
variations in mean scores should be noted. For instance, items relating to research-related
skills were uniformly rated high. Among these skills, online search skills and the ability to
read and understand research were rated the most needed. In contrast, the respondents were most
reserved in their opinions about RBI. Their neutrality is noticeable for the following two items:
“research is easy to understand” and “research is easy to transfer to one’s practice.” Considerable
disparity in mean scores was observed for the variables pertaining to practice constraints where
availability of time to read and apply research ranked as the most influential and the impact of
organized groups (such as unions) on practitioners’ decision to implement research ranked the
least.
Correlation analysis examined the association between the subscales of the questionnaire
and demographic variables. Because the use scores were low and not highly variable, the
following composite scores were calculated:
a) use of the sources of research-based information,
b) conceptual use,
c) instrumental use, and
d) symbolic use (measured by a single root item).
Factor scores represented the four factor subscales including:
a) practitioners’ opinions about RBI,
b) awareness activities,
c) expertise, and
d) organizational factors.
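A composite of this kind is simply the mean of the items loading on one dimension, which can then be correlated with a demographic variable. The sketch below illustrates the computation with random toy data and hypothetical column names, not the survey dataset.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Toy item-level responses on the 0-3 use scale; names are illustrative only.
df = pd.DataFrame(
    rng.integers(0, 4, size=(60, 3)).astype(float),
    columns=["understand_issues", "reflect", "curiosity"],
)
df["coursework"] = rng.integers(0, 2, size=60)  # 0/1 demographic flag

# Composite score: mean of the items that load on one use dimension.
df["conceptual_use"] = df[["understand_issues", "reflect", "curiosity"]].mean(axis=1)

# Pearson correlation between the composite and a demographic variable,
# analogous to the coefficients reported in Table 3.
r = df["conceptual_use"].corr(df["coursework"])
```

Averaging items keeps the composite on the same 0-3 scale as the individual items, which makes the low mean use scores directly interpretable.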
Summarized in Table 3, the coefficients show that job responsibility, coursework in
research methods, and participation in research projects correlated positively and
significantly with the four composites of self-reported use of research-based information,
as did three of the four factor subscales; organizational factors did not.
Table 3
Correlations Between QURBI and Demographic Variables
                          1        2        3        4        5        6        7        8        9       10       11       12      13     14     15     16
1. Use of RBI sources
2. Conceptual use       0.66***
3. Instrumental use     0.68***  0.83***
4. Symbolic use         0.53***  0.71***  0.71***
5. Gender               0.04     0.05     0.05     0.06
6. Experience          -0.01     0.05    -0.05     0.01    -0.10**
7. Responsibility       0.15**   0.16***  0.13***  0.14*** -0.11**   0.13**
8. Coursework           0.21**   0.21**   0.17**   0.19**   0.00    -0.04     0.08
9. Participation in
   research             0.32***  0.34***  0.30***  0.26*** -0.02     0.14**   0.14**   0.30***
10. Grade              -0.03    -0.04    -0.10*   -0.10*   -0.24***  0.03     0.06     0.03     0.05
11. School size         0.04     0.02     0.01     0.04    -0.11**   0.03     0.00     0.00     0.03     0.33***
12. School location     0.06     0.05     0.03     0.04    -0.05     0.01    -0.11**   0.00     0.10**  -0.01     0.25***
13. Language of
    teaching           -0.04     0.01     0.05     0.03     0.07     0.02    -0.09*   -0.01     0.00    -0.09*    0.05     0.06
14. Opinions about RBI  0.35***  0.43***  0.46***  0.37***  0.01     0.06     0.11**   0.07     0.20*** -0.04     0.00     0.04    0.01
15. Awareness
    activities          0.20**   0.26**   0.22**   0.19**   0.05     0.00     0.19**   0.02     0.13**  -0.05    -0.02     0.04   -0.02   0.00
16. Expertise           0.16**   0.17***  0.13**   0.15**   0.04     0.02     0.04     0.15**   0.16**   0.00    -0.04     0.03   -0.02   0.00   0.00
17. Organizational
    factors             0.02     0.01     0.01     0.05     0.00     0.07    -0.01    -0.04    -0.02    -0.03     0.00    -0.14*   0.01   0.00   0.00   0.00
**p < .01, ***p < .001
Gender and teaching experience, as well as all school properties, did not correlate with
any of them. Administrators tended to report more involvement in research projects and to hold
more favourable opinions about research and the importance of awareness activities for their
practice than teachers and professionals did. Practitioners who took coursework in research
methods reported higher participation in research projects and gave more weight to
research-related skills.
Prior participation in research is also positively related to practitioners’ opinions about
research-based information, activities to raise their awareness of research findings, and the
importance they give to research-related skills for their practice.
QURBI Open-ended Questions
The thoughts practitioners shared through the six open-ended questions represent an
important part of this study’s data collection. Although many respondents skipped the
open-ended questions, 504 practitioners commented in some form on at least one of them. The
number of comments per question ranged from 362 (sources of RBI) to 114 (dimensions of use),
with an average of 208. The majority of commenters (84.1%) were teachers. Two to three times
more comments came from respondents who reported having used RBI three times or more over the
past year. However, we found no significant relationship between the categories that emerged
from the comments and the frequency of self-reported use, implying that the pattern of
individual preferences is likely independent of the self-reported use of RBI.
In their comments about the sources of research-based information, the majority of the 362
respondents (91.7%) reported having used RBI from a blend of sources. According to them, it is
challenging to choose a source truly based on research findings because some sources lack a
research base. For instance, one respondent noted about professional development workshops that
“resource teachers … do not back up the methods with research. The approach turns out to be
‘evangelical’, i.e., supporting the latest fad with a lot of fervour, rather than presenting new
ideas that are research-based.” Collegial exchanges were viewed as a primary conduit of
information: “I believe that a teacher is more likely to implement a new practice if it comes
from a colleague who has tried it.” The respondents rated colleagues the top reference source
(55.8%). The preference given to such casual information flow contrasted with the relatively
little importance attached to centralized, formal communication provided by the federal and
provincial governments (12.7%), professional unions and associations (6.63%), and school boards
(10.5%). Respondents reported using books (48%) more than academic journals (11.1%). In
particular, Diller’s writing on literacy workstations, Caulkin’s works on primary writing, and
van de Walle’s student-centred mathematics were referred to frequently. The Internet was
emphasized by respondents (35.1%) as an increasingly popular point of access to research
findings. Only 3.6% of participants reported having used assessment results or information
from university programs, and only 0.8% consulted research databases.
Although only 114 practitioners out of 362 commented on “dimensions of use,” 76.3%
of them reported having used research findings instrumentally. This use targeted students and
classrooms, school and professional communities, parents and teachers themselves. Practitioners
used RBI as a classroom tool for tasks that included helping special needs students, classroom
management, increasing student motivation, evaluating student performance, and empowering
students in their learning. Conceptual use was reported by 35.9% mainly as a way to support
practitioners in setting goals and developing expectations as well as to reflect, make decisions,
develop a personal philosophy of teaching, and validate their practice. Reports of symbolic use
of research to legitimize actions without necessarily changing practices or taking action were few
(4.4%). They pertained to justifying assessment results and curricular decisions as well as
persuading parents about a particular course of action taken.
The opinions (N=256) that emerged about RBI were quite critical of the information
itself, with criticisms targeting the irrelevance of the research (38.7%). Research was
characterized as “Ivory Tower” conducted for “perfect students with no problems, no personal
issues” and “never factoring in the different environments, cultures, and socio-economic
conditions kids come from.” Some respondents complained that the spectrum of RBI is too
broad, making it impossible to relate it consistently to classroom situations. Others judged RBI
as “narrowly focused” and ignoring “possible or combined reasons for the results.” The
disconnect between research and practice was attributed to a number of causes. Specifically,
researchers were described as people who had never taught or had done so long ago that “they
lost touch with the realities of the classroom,” whereas research findings were described as
often unrelated to classroom reality, “the big ideas from the ministry.” Furthermore,
respondents commented that too much research was driven by U.S. trends, “extrapolating
American findings on Canadian practices.” A few respondents suggested that teachers’
involvement in research would ensure its relevance; for instance, “partnerships involving
practitioners in practical research should be favoured.”
The methodological quality of educational research was also criticised, including its lack
of rigour and unreliable findings. The reader “unfriendliness” of research reports (16.8%),
“overburdened with catch phrases and keywords that hide the true impact” and inaccessible
“for quick reading,” was another point of criticism. The respondents wanted short, clear
reports incorporating “actual examples,” as well as summaries and syntheses with “a ballpark
figure to gauge effectiveness.” They also wanted clear, strategic, and comprehensive
recommendations to guide practice. Some reported that a limited ability to understand research,
especially statistical data, might account not only for the negative perception of RBI, but
also for poor utilization and misuse of the information. Practitioners suggested that
pre-service and in-service education should be venues to develop the skills to read, interpret,
and relate research results to teaching practice.
The physical inaccessibility of research also discourages practitioners from using it in
their practice (12.9%). Many practitioners do not have access to online libraries and databases.
Those who do reported having some contact with universities, being university students, or
maintaining contact through their previous jobs. Practitioners living in remote communities do
not have adequate access. Some practitioners must rely on “hard copies of the out-dated research
papers distributed at school and school board” as their only sources of RBI. In this respect,
the timeliness of research is also a concern.
Few respondents (7%) produced balanced appraisals of RBI. For instance, RBI is
described as “sometimes far-fetched, sometimes relevant and credible.” According to them,
practitioners’ opinions about RBI “depend upon the research and who did it.”
Even if RBI is accessed, read, understood, and found relevant, there are a number of
practical challenges to its systematic use in the classroom (12.5%). Lack of support and
resources, large classes, lack of time to plan for changes and to modify the information to meet
students’ needs, and the mere stress of the day “often cause practitioners to lapse into a more
Page 14
Lysenko, Abrami, Bernard, & Dagenais Research Use in Education
48
Brock Education Journal, 25 (1), Fall 2015
traditional pedagogical method because it offers the path of least resistance, not because it is best
practice.”
Comments on “activities raising awareness” of research findings (N=192) emphasized the
importance of demonstrating how the findings of a particular study can be used in classroom
practice (26.7%). To be useful, these demonstrations should reflect classroom reality and show
how findings can be incorporated into practice seamlessly. Practitioners reported they would also
appreciate lesson plans and assessment tools based on research findings. These plans and tools
could be incorporated into workshops and seminars delivered in person or as video clips via the
Internet. For instance, 20% of the respondents suggested an open-access meta-resource, “access
point,” “tool box,” “resource bank,” or “central list of research,” consolidating summarized or
synthesized applied educational research. As a longed-for tool to improve school practice, this
resource would be accompanied by clear recommendations and demonstrations and structured by
subject or problem areas.
Comments on expertise needed to use RBI (N=143) ranged from skills and individual
dispositions to external support and expertise. A key set of abilities included information search
skills (9.6%), research appraisal skills (20.6%), expertise to relate and adapt research results to
practical context (20.6%), abilities to summarize and communicate available research findings
(6.2%), and abilities to conduct one’s own research (6.2%). The importance of external support
was mentioned by 27% of respondents. They need the presence of a “more knowledgeable other”
(expert) who can work well with teachers and other practitioners to discuss and explain research,
to summarize and translate ideas that are relevant to teachers, to demonstrate the usefulness of
findings in a classroom setting, and to provide tools and resources to help apply RBI in a
particular situation.
Among the reported “organizational factors” (N=177), lack of time was an important
barrier to practitioners’ decision to use RBI (26.4%). Time is needed to search out sources and
then read, understand, reflect, and incorporate ideas into practice. Some suggest that such time
should be formally allocated in job contracts or assigned by school administrations or school
boards. Others propose time-saving strategies and demand ready-made materials: “Just provide
the data and the appropriate lesson plan, and I am in." Others feel that using research is part
of their obligation to students: “I can’t wait until the powers decide when it will be the time. Nor
can I use the excuse of time, or other reasons not to explore my teaching.”
Support and encouragement were reported as another critical need (52%). School
practitioners count mainly on school administrations for support (27.7%) as they “can allow
time,” “encourage team meetings,” and “promote a safe working environment for research
inquiry.” However, the use of RBI should become a priority for school leaders: “Emphasis and
priority on research-based information need to start with school administration to effectively
filter down to teachers.” There is possibly only a fine line separating engaging and encouraging
initiatives from authoritative force and pressure. As a result, some practitioners develop
resistance to the top-down approach: “Research that I am told to use makes me resist . . . I use
research when I feel it will be useful to me, not when someone else pressures me.”
As well, proactive leadership needs to be combined with the effort of the school
community to contribute to a spirit of research-based school practices. Respondents mentioned a
“critical mass of people” who could unite colleagues doing research or who are involved in
successful implementation of research findings (13%); support from school boards (11.3%),
professional unions and governments (10.7%), and parental support (3.9%).
A number of respondents (12.4%) also mentioned that political and ideological agendas
pursued by governing bodies at various levels do not usually take proper account of RBI. Some
reported conflicts that emerged when educational groups and organizations pushed information
based on their priorities, and ignored or hampered practitioners’ initiatives to use research
findings relevant to their teaching practice. It is even more complicated when the expectations
and requirements of ministries and boards clash; practitioners strongly oppose getting involved
in such a “battle of dueling experts.”
Discussion
The data collected in this online survey show that the use of RBI by individual practitioners
continues to be a challenge. In accordance with previous findings (Bérubé, 2005; Kretlow &
Helf, 2013; Lysenko et al., 2014; Williams & Coles, 2007), practitioners' self-reports are marked
by low-threshold use. Despite this infrequent use, our data did reveal a pattern of sources used,
hinting at the nature of information favoured by school practitioners. When compared with
previous findings (e.g., Williams & Coles, 2007), the number of practitioners who use the World
Wide Web for sources has increased, with practitioners commenting on the Web’s ability to give
relatively unrestricted access to information. Also remarkable is the importance given to peer
exchanges of practice-relevant information, with the open-ended reports implying that colleagues
(both near and far) are a primary source of information.
However, Internet sources and collegial networking may not always have a foundation in
research findings. For instance, despite the fact that a number of practitioners referred to
governmental, professional, and school board Web sites and electronic databases as their major
sources, others reported their reliance on blogs, general search tools like Google and Yahoo, and
social media platforms. Schneider’s (2008) qualitative study of “edubloggers’” online training
practices suggests that rarely did these bloggers validate their content, indicate references, or
even take any training courses. While collegial networking can bring research to the front lines
of educational practice quite quickly, the quality of the “word of mouth” stratagem should not be
overestimated either. Davis (2008) argues that these accounts of research are often truncated,
decontextualized, and blended with anecdotal experiences of the colleague, rendering them
largely inaccurate.
Contrary to the argument that practitioners are more prone to use local data, which they
value for its contextual relevance as opposed to the research generated in academia, the
self-reports revealed low use of school evaluation data. Only 6% of commenters mentioned that
assessment data informed their teaching. Whether the data are accessible or usable for making
individual or collective decisions is unknown. However, the comments suggest that these
contextually relevant data do not often trickle down to the interested teachers.
It is important to note that respondents rarely referred to teacher education programs as a
source of RBI. According to Cousins and Walker (2000), only novice practitioners refer to
teacher education programs to compensate for their lack of experience. In our sample, only 17%
of respondents had less than three years of experience, and there was no observable relationship
found between teaching experience and the use of RBI. Consistent with the literature (e.g.,
Cousins & Walker, 2000), coursework in research practice as well as prior participation in
research projects were associated with the self-reported use of research-based information. On
the one hand, this link implies that the effects of formal training in which research practice is
incorporated are likely to stay with practitioners. On the other hand, research experience gained
in their workplace through participation in research projects may also increase teachers’
engagement with RBI.
In terms of how practitioners use RBI, the self-reports suggest predominantly
instrumental uses. However, respondents would rather use RBI for the broader purpose of
improving professional practices than for finding solutions to everyday issues. It is possible that
the research findings are not presented in a way that offers answers to specific issues of practical
concern or perhaps practitioners are simply used to going elsewhere (e.g., to colleagues) to find a
quick fix.
Although existing studies report that educational practitioners express positive attitudes
towards educational research (e.g., Ratcliffe et al., 2005; Williams & Coles, 2007), the
respondents in this study took a neutral stance and we can only speculate about the reasons for
their position. Research knowledge utilization has become a buzzword in education and
practitioners might feel obliged to hide their negativity about the value of educational research to
conform to the prevailing winds. This supposition receives support from the negative comments
made about RBI, namely that research is not rooted in school reality, is often irrelevant, is
difficult to access, and contains convoluted language. The fact that these comments were given
by the respondents who reported having used research should raise a red flag for communities
engaged in research production and dissemination, including universities and research-brokering
organizations inside and outside the school system.
Both the qualitative and quantitative data reveal the importance practitioners attribute to
the potential of research to improve practice. They gave clear and explicit requests for research
publications to offer more specific recommendations for applying research findings to practice.
Practitioners valued professional development activities as a way to raise awareness of RBI.
They suggested information literacy skills together with the ability to appraise and act upon
research evidence combined with a willingness and openness to change as being critical for the
use of research to inform classroom practice. At the same time, the availability and accessibility
of external expertise to help with basic understanding of research and its adaptation and
application was also valued. Practitioners noted the importance of structural support, especially
time and collaboration, in the successful use of research knowledge. Time especially was
identified as essential for searching out sources, reflecting on the material, and incorporating
ideas into practice.
Conclusions
Although the selected research methodology has limitations, including its correlational design,
reliance on self-reports, and non-probabilistic sampling, several lessons can be gleaned from this
study, particularly about the need to cultivate positive attitudes toward, and the capacity for,
research use.
Firstly, teacher education programs can help practitioners develop a solid foundation in the skills
needed to engage in and with research by formulating questions, finding solutions by searching
for and appraising existing information for appropriateness, applying the information, and
evaluating its effectiveness. Teacher programs should balance courses in curriculum and
instruction with systematic inquiry activities and classes that emphasize the practical value of
evidence generated from research. Ideally, professors should communicate an interest in
pedagogical and education research to underscore the importance of such research in informing
classroom practice. Secondly, a more systemic approach is needed to ensure that using
research-based information is not abandoned once novice practitioners enter the school system.
Research-based information should be readily accessible to busy practitioners and decision
makers. There needs to be a “place,” a public resource where such information can be brought
together, assessed for quality and relevance, and then organized for professional use. Effective
models for condensing research-based information and communicating it to educational
professionals can be found in the health and social care fields. Similarly, expansion of research
brokering organizations as intermediaries between educational research and practice may
stimulate research use by their capacity-building functions, implementation support,
organizational development, and policy influence (Cooper, 2014). Finally, there need to be
structures in place to support the practitioner. These structures should: (a) provide access to
research that is written for non-scientists and accompanied by clear recommendations and
demonstrations on how it is to be effectively applied, (b) integrate on-going research-based
professional development, (c) create opportunities and stimulate the intellectual need to share
experience gained in research implementation, and (d) put in place administrative and
managerial support structures for the time and energy required.
In sum, this research suggests that there is still some way to go to ensure that knowledge
generated by educational research routinely informs educational practice. Systemic and
coordinated efforts by those on both research and practice sides—including school boards,
education faculties, and knowledge mobilization agents—are required to build school capacity
and enable teachers to be key innovation agents whose teaching makes a
difference at the end of the day.
Acknowledgements
This work was supported by a Social Sciences and Humanities Research Council of Canada
Presidential Fund (Grant # S 012115).
References
Bérubé, B. (2005). L'accès à la recherche en enseignement et son utilisation dans la pratique :
résultats d'une enquête auprès des enseignants et des enseignantes du préscolaire, du
primaire et du secondaire. [Access to educational research and its use in educational
practice: The survey results of teachers in kindergarten, elementary and secondary
schools]. Retrieved from Conseil supérieur de l’éducation website:
http://www.cse.gouv.qc.ca/EN/Download/index.html?id=50-2099en&cat=50-2099_EN
Coburn, C. E., Honig, M. I., & Stein, M. K. (2009). What's the evidence on districts' use of
evidence? In J. D. Bransford, D. J. Stipek, N. J. Vye, L. M. Gomez & D. Lam (Eds.), The
role of research in educational improvement (pp. 67-87). Cambridge, MA: Harvard
Education Press.
Cooper, A. (2014). Knowledge mobilization in education across Canada: A cross-case analysis
of 44 research-brokering organizations. Evidence & Policy, 1, 29–59.
http://dx.doi.org/10.1332/174426413X662806
Cousins, J. B., & Walker, C. (2000). Predictors of educators’ valuing of systematic inquiry in
schools. Canadian Journal of Program Evaluation, Special Edition, 25-52.
Dagenais, C., Janosz, M., Abrami, P. C., Bernard, R. M., & Lysenko, L. (2008). Integrating
research-based information into professional practices by teachers and administrators:
Towards a knowledge transfer model adapted to the education environment: Executive
Summary. Retrieved from
http://www.ccl-cca.ca/pdfs/FundedResearch/Dagenais-ExecSumm-E-SL2006.pdf
Dagenais, C., Lysenko, L., Abrami, P., Bernard, R., Ramde, J., & Janosz, M. (2012). Use of
research-based information by school practitioners and determinants of use: A review of
empirical evidence. Evidence & Policy, 8, 285–309.
Davis, S. H. (2008). Research and practice in education: The search for common ground.
Lanham, MD: Rowman & Littlefield Education.
Estabrooks, C. A. (1999). Modeling the individual determinants of research utilization. Western
Journal of Nursing Research, 21, 758–772. doi:10.1177/01939459922044171
Hughes, M., McNeish, D., Newman, T., Roberts, H., & Sachdev, D. (2000). What works?
Making connections: Linking research and practice. Essex, U.K.: Barnardo's.
Knott, J., & Wildavsky, A. (1980). If dissemination is the solution, what is the problem?
Knowledge: Creation, Diffusion, Utilization, 1, 537-578.
Kretlow, A. G., & Helf, S. S. (2013). Teacher implementation of evidence-based practices in
Tier 1: A national survey. Teacher Education and Special Education, 36, 167–185.
doi:10.1177/0888406413489838
Leech, N. L., & Onwuegbuzie, A. J. (2011). Beyond constant comparison qualitative data
analysis: Using Nvivo. School Psychology Quarterly, 26, 70-84. doi:10.1037/a0022711
Levin, B. (2013). To know is not enough: Research knowledge and its use. Review of Education,
1, 2–31. doi:10.1002/rev3.3001
Lysenko, L., Abrami, P., Bernard, R., Dagenais, C., & Janosz, M. (2014). Educational research
in educational practice: Predictors of use. Canadian Journal of Education, 37(2).
Retrieved from http://www.cje-rce.ca/index.php/cje-rce/article/view/1477
Neilson, S. (2001). Knowledge utilization and public policy processes: A literature review.
IDRC-Supported Research and its Influence on Public Policy. Retrieved from
http://idrinfo.idrc.ca/archive/corpdocs/117145/litreview_e.html
Organization for Economic Co-operation and Development. (2007). Knowledge management:
Evidence and education. Linking research and practice. Paris, France: OECD
Publications Service.
Organization for Economic Co-operation and Development. (2010). Evidence in education:
Linking research and policy. Paris, France: OECD Publications Service.
Ratcliffe, M., Bartholomew, H., Hames, V., Hind, A., Leach, J., Millar, R., & Osborne, J. (2005).
Evidence-based practice in science education: The researcher–user interface. Research
Papers in Education, 20, 169–186. http://dx.doi.org/10.1080/02671520500078036
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Simon and Schuster.
Schneider, K. (2008). A qualitative study of five authors of five blogs on training and
development. Unpublished MA dissertation. Concordia University, Montreal, Canada.
Retrieved from http://spectrum.library.concordia.ca/975987/1/MR45326.pdf
Sharples, J. (2013). Evidence for the frontline. Retrieved from Alliance for Useful Evidence
website: http://www.alliance4usefulevidence.org/assets/EVIDENCE-FOR-THE-
FRONTLINE-FINAL-5-June-2013.pdf
Sunesson, S., & Nilsson, K. (1988). Explaining research utilization beyond “functions”.
Knowledge: Creation, Diffusion, Utilization, 10, 140-155.
Walter, I., Nutley, S.M., & Davies, H.T.O. (2005). What works to promote evidence-based
practice? A cross-sector review. Evidence & Policy, 1, 335-364.
Weiss, C.H. (1980). Knowledge creep and decision accretion. Knowledge: Creation, Diffusion,
Utilization, 1, 381-404.
Williams, D., & Coles, L. (2007). Teachers' approaches to finding and using research evidence:
An information literacy perspective. Educational Research, 49, 185–206.
http://dx.doi.org/10.1080/00131880701369719