DIFFUSION OF META-ANALYSIS, SYSTEMATIC REVIEW, AND RELATED RESEARCH SYNTHESIS METHODS: PATTERNS, CONTEXTS, AND IMPACT
Laura Sheble, [email protected], Tel: 919-260-2031
21 Rogerson Drive, Chapel Hill, NC 27517
Current Institution: University of North Carolina
Advisor: Diane Kelly, [email protected], Tel: 919-962-8065
100 Manning Hall, CB 3360, Chapel Hill, NC 27599-3360
Degree-granting institution: University of North Carolina
Dissertation completed: 11 May 2014
In Stage II, topic models (Blei, Ng, & Jordan, 2003; Ramage & Rosen, 2011) were used to summarize and visualize content of
research synthesis publications; and the knowledge base that informs research synthesis-related
publications in each field was visualized using methods described in Stage I.
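As an illustration of the topic modeling approach cited above (Blei, Ng, & Jordan's latent Dirichlet allocation), the following is a minimal sketch using scikit-learn; the toy corpus and this particular library choice are assumptions for the example, not the dissertation's actual pipeline.

```python
# Minimal LDA sketch for summarizing publication content.
# The four short "abstracts" below are hypothetical.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "meta-analysis effect size estimation in clinical trials",
    "systematic review search strategy and study selection",
    "effect size heterogeneity in meta-analysis of trials",
    "search strategy reporting in systematic reviews",
]
# Bag-of-words term counts, then a two-topic LDA model.
X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)  # per-document topic proportions
```

Each row of `doc_topics` gives one document's mixture over the two topics, which is the kind of output that supports content summaries and visualizations.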
In Stage III, bibliometric methods were used to assess the impact of research synthesis
methods on collaboration and citation patterns. Wilcoxon rank sum tests, followed by van
Elteren (1960) tests, were used to test for differences. The van Elteren test is an extension of
the Wilcoxon test: it is a non-parametric test for differences between two groups, controlling for another
factor. The dependent variables were the extent of use of research syntheses versus reviews
(citations at five years); and the number of collaborating authors on each type of publication. The
independent variable was the publication type (review or research synthesis). Data were limited to
records for three publication years, 2006-2008.
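A small sketch of the van Elteren (1960) test as described above, implemented as a locally weighted stratified Wilcoxon rank sum statistic with a normal approximation; ties are not corrected for, and the strata and counts below are hypothetical.

```python
import numpy as np
from scipy.stats import rankdata, norm

def van_elteren(strata):
    """strata: list of (x, y) sample pairs, one pair per stratum
    (e.g., per publication year)."""
    T = E = V = 0.0
    for x, y in strata:
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        m, n = len(x), len(x) + len(y)
        ranks = rankdata(np.concatenate([x, y]))
        w = ranks[:m].sum()             # Wilcoxon rank sum for group x
        T += w / (n + 1)                # locally weighted contribution
        E += m / 2.0                    # E[w / (n + 1)] under H0
        V += m * (n - m) / (12.0 * (n + 1))
    z = (T - E) / np.sqrt(V)
    return z, 2.0 * norm.sf(abs(z))     # two-sided p-value

# Two strata in which group x is consistently shifted above group y.
strata = [([5, 6, 7], [1, 2, 3]), ([8, 9, 10], [2, 3, 4])]
z, p = van_elteren(strata)
```

Stratifying by year before pooling, as here, is what lets the comparison of publication types control for publication year.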
FINDINGS
Stage I: Macro-level diffusion of research synthesis methods
Across all years (1972-2011), 123,881 records that provide evidence of engagement with
research synthesis were found. The number of publications increased over time and varied by
research field (Figure 1). The greatest increase in the diversity of fields engaged with research
synthesis occurred in the first half of the 1990s, which corresponds with the rise of the evidence-
based practice and policy movements (EBP). A strong correlation between engagement with
EBP and research synthesis methods (ρ = 0.893, p < 0.001) across fields underscores the strength
of this relationship.
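A rank correlation of the kind reported above can be computed as follows; the per-field counts here are invented purely to illustrate the calculation.

```python
# Hypothetical per-field publication counts: EBP-related records and
# research synthesis records for eight fields.
from scipy.stats import spearmanr

ebp_pubs = [12, 45, 3, 88, 20, 7, 150, 31]
rs_pubs = [10, 50, 5, 80, 35, 6, 140, 24]

# Spearman's rho correlates the *ranks* of the two count vectors,
# so it captures monotone association without assuming linearity.
rho, p = spearmanr(ebp_pubs, rs_pubs)
```

Because the counts are only rank-correlated, a few fields with swapped orderings still yield a strong positive ρ.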
Figure 1. Rate of research synthesis publication (left) and rate of change in publication rate (right) differ across science fields. Note: Normalized publication count = number of publications per 10,000.
The slopes and shapes of publication rates (Figure 1, left) depict a range of growth
trajectories. The dense mat in the lower right includes fields that have only recently begun to
show interest in research synthesis methods. Differences in changes in publication rates (Figure
1, right) suggest that engagement with research synthesis has fluctuated to differing extents
across fields. It is possible that some fields are more affected by research “fads” and “fashions”
(Abrahamson, 1991), which may be induced externally (fashions) such as by funding agencies,
or internally (fads) from group dynamics.
Stages II and III: Contexts and impact of diffusion
Diffusion within fields: Comparison of the relative distributions of research synthesis
studies across journals in the five fields investigated at the meso-level indicates that engagement,
as summarized by the Gini coefficient (Table 1), was most pervasive in Social Work and most
uneven in Women’s Studies. Engagement with the methods in Evolutionary Biology,
Conservation Biology, and ILS was relatively unbalanced, and distributions across journals
approximated the eighty-twenty rule. Contrasts between fields become clearer when the relative
timing of first engagement with the methods is considered. While the Social Sciences and ILS
first engaged with research synthesis in the 1970s and 1980s, the Biological Sciences did not
engage with the methods until the 1990s. Although research synthesis methods appeared later in
the biological sciences, relatively consistent increases in use suggest that adoption there has been
comparatively smooth relative to Women's Studies and ILS. The presence of relatively distinct
subfields in Women's Studies and ILS may contribute to a more punctuated diffusion, since
diffusion can slow when crossing community boundaries. Comparison of the relative levels of
engagement with research synthesis
over time provides greater contrast (Figure 2).
Table 1. Diffusion within selected research fields

                     EB      CB      SW      WS      ILS
  Gini Coefficient   0.767   0.775   0.555   0.869   0.796
  1st Year           1991    1994    1977    1985    1985

Note: EB=Evolutionary Biology; CB=Conservation Biology; SW=Social Work; WS=Women's Studies; ILS=Information & Library Science
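The Gini coefficients in Table 1 summarize how unevenly a field's research synthesis publications are spread across its journals. A minimal sketch of the computation, with hypothetical per-journal counts:

```python
import numpy as np

def gini(counts):
    """Gini coefficient of concentration for per-journal publication counts.
    0 = perfectly even spread across journals; values near 1 = heavy
    concentration in a few journals."""
    x = np.sort(np.asarray(counts, dtype=float))
    n = x.size
    total = x.sum()
    # Standard rank-weighted formula for the Gini coefficient.
    return (2.0 * np.sum(np.arange(1, n + 1) * x) - (n + 1) * total) / (n * total)

even = gini([5, 5, 5, 5])      # every journal publishes equally -> 0.0
skewed = gini([0, 0, 0, 20])   # one journal carries everything -> 0.75
```

On this scale, the Table 1 values near 0.77-0.87 indicate that most fields' research synthesis output is concentrated in a small share of journals.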
Modes of engagement: The proportions of publications that reflected engagement with
research synthesis by mode varied across fields (Figure 2). Across all fields, reports of studies
in which research synthesis methods were used were the most frequent mode of engagement
(54.22% in ILS to 85.71% in Conservation Biology). Relative proportions of other modes of
engagement, including methods development (e.g., search strategy development) were generally
most prevalent in ILS. The majority of these papers focused on research synthesis in health and
medicine, reflecting the ‘meta-disciplinary’ nature of ILS (Bates, 1999).
Figure 2. Timing, extent, and modes of engagement with research synthesis methods. Panels: a. Evolutionary Biology; b. Conservation Biology; c. Social Work; d. Women's Studies; e. Information & Library Science. (Key: Modes of Engagement.)
Resources associated with the diffusion of research synthesis include research funding,
organizations, and availability of information infrastructures (e.g., databases). None of the fields
examined had a primary literature resource that was considered sufficient to identify primary
studies for research synthesis. Funding for work related to research synthesis was available in the
biological sciences, including funding to develop and apply the methods. In 2008-2011, 61.68% of
Evolutionary Biology and 67.33% of Conservation Biology papers were associated with grant-
funded projects. In contrast, Social Work researchers received little financial support, though this
may reflect funding challenges in the field more broadly. In Women’s Studies, almost a third of
all research synthesis-related papers were associated with funded projects. All of these were
[Figure panels plot Number of Publications (0-70) by Publication Year (1975-2010).]
In contrast, traditional research reviews in ILS were produced and used at a higher rate
compared to research syntheses. Additionally, Stage I data indicates ILS researchers engage with
past research at a relatively low rate. These observations indicate that research reviews are a
relatively rare but highly prized – or at least frequently used – resource (Table 5). Though ILS
research syntheses generally have a greater number of authors, they were not associated with
higher levels of citation. A possible explanation for these observations is that, like publications
examined in the second investigation of Levitt and Thelwall (2009), traditional research reviews
were written by ‘elite’ or ‘influential’ information scientists, who tend to have higher citation
rates whether or not they collaborate with others. Such an explanation suggests ILS places a
higher value on researcher experience and expertise, but with the possible consequence of less
egalitarian evaluations of research (Huth, 2009).
Broad implications: ILS applications and open science: Lack of primary centralized
information resources was notable across fields. ILS researchers and practitioners can contribute
to research syntheses through continued efforts to collocate, organize, integrate, and provide
access to research materials in a manner that is commensurate with research practices within
science fields. The intersection of research synthesis and ILS examined in Stage II clearly
identified ways ILS has contributed to research synthesis in health and medicine. A broader set
of fields could benefit from ILS research and practice.
Issues of concern to investigators who use research synthesis suggest that open access
and open science practices should be considered in a broad context. To perform a research
synthesis, findings and data need to be reported such that they may be directly incorporated in
subsequent studies. Open access should imply open access to original research studies through
reports and archival data materials that are easy to identify and retrieve through information
systems. As researchers continue to develop methods for data synthesis, the importance of
accessible, interoperable, and reusable data will only increase.
When this research project was introduced, it was noted that research synthesis methods, like
collaboration and interdisciplinary scholarship, promote the integration of research-based
knowledge, an important counterbalance to specialization. Research synthesis differs in that it
involves integration at more granular levels and in response to specific questions. Future
research should investigate relationships between approaches to research integration and the
outcomes of programs designed to promote integration of science knowledge. A better
understanding of differential effects of these approaches would lead to a better understanding of
integration, or synthesis, in science overall, and could inform research policy program design.
REFERENCES
Abrahamson, E. (1991). Managerial fads and fashions: The diffusion and rejection of innovations. Academy of Management Review, 16(3), 586-612.
Adler, P. S. & Kwon, S.-W. (2012). The mutation of professionalism as a contested diffusion process: Clinical guidelines as carriers of institutional change in medicine. Journal of Management Studies, 50(5), 930-962. doi: 10.1111/joms.12003
Ansari, S. M., Fiss, P. C., & Zajac, E. J. (2010). Made to fit: How practices vary as they diffuse. Academy of Management Review, 35(1), 67-92.
APA (Publications & Communications Board Working Group). (2008). Reporting standards for research in psychology: Why do we need them? What might they be? American Psychologist, 63(9), 839-851. doi: 10.1037/0003-066X.63.9.839
Asuncion, A., Welling, M., Smyth, P., & Teh, Y. W. (2009, June). On smoothing and inference for topic models. In Proceedings of the Twenty-Fifth Conference on Uncertainty in Artificial Intelligence (pp. 27-34). AUAI Press.
Barrios, M., Guilera, G. & Gómez-Benito, J. (2013). Impact and structural features of meta-analytical studies, standard articles and reviews in psychology: Similarities and differences. Journal of Informetrics, 7(2), 478-486. doi: 10.1016/j.joi.2013.01.012.
Bastian, H., Glasziou, P., & Chalmers, I. (2010). Seventy-five trials and eleven systematic reviews a day: How will we ever keep up? PLoS Medicine, 7(9). doi: 10.1371/journal.pmed.1000326
Bates, M. J. (1999). The invisible substrate of Information Science. Journal of the American Society for Information Science, 50(12), 1043-1050.
Biglan, A. (1973). The characteristics of subject matter in different academic areas. Journal of Applied Psychology, 57(3), 195-203.
Blei, D., Ng, A., & Jordan, M. (2003). Latent Dirichlet Allocation. Journal of Machine Learning Research, 3, 993-1022.
Boschma, R. (2005). Proximity and innovation: A critical assessment. Regional Studies, 39(1), 61-74. doi: 10.1080/0034340052000320887
Bourdieu, P. (1988). Homo academicus (P. Collier, Trans.). Stanford, CA: Stanford University Press.
Boyle, E. W. (2012). Methodological challenges in alternative medicine research. International Journal of Science in Society, 3(2), 1-8.
Buckland, M. (1991). Information as thing. Journal of the American Society for Information Science, 42(5), 351-360.
Cadotte, M.W., Mehrkens, L.R., & Menge, D.N.L. (2012). Gauging the impact of meta-analysis on ecology. Evolutionary Ecology, 26, 1153-1167.
Carlile, P. R. (2004). Transferring, translating, and transforming: An integrative framework for managing knowledge across boundaries. Organization Science, 15(5), 555-568.
Chalmers, I., Hedges, L. V., & Cooper, H. (2002). A brief history of research synthesis. Evaluation & the Health Professions, 25(1), 12-37. doi: 10.1177/0163278702025001003
Chubin, D. E. (1976). The conceptualization of scientific specialties. Sociological Quarterly, 17(4), 448-476.
Cooper, H. (1986). Literature-searching strategies of integrative research reviewers: A first survey. Science Communication, 8(2), 372-383. doi: 10.1177/107554708600800217
Cooper, H., & Hedges, L. V. (1994). The handbook of research synthesis. New York: Russell Sage Foundation.
Cooper, H., & Patall, E. A. (2009). The relative benefits of meta-analysis conducted with individual participant data versus aggregated data. Psychological Methods, 14(2), 165-176.
Denis, J.-L., Hébert, Y., Langley, A., Lozeau, D., & Trottier, L.-H. (2002). Explaining diffusion patterns for complex health care innovations. Health Care Management Review, 27(3), 60-73.
Dijkers, M. P. J. M. (2009). The value of “traditional” reviews in the era of systematic reviewing. American Journal of Physical Medicine & Rehabilitation, 88(5), 423-430. doi: 10.1097/PHM.0b013e31819c59c6
Frickel, S., & Gross, N. (2005). A general theory of scientific/intellectual movements. American Sociological Review, 70(2), 204-232.
Harris, J. K., Beatty, K. E., Lecy, J. D., Cyr, J. M., & Shpiro, R. M. (2011). Mapping the multidisciplinary field of public health services and systems research. American Journal of Preventive Medicine, 41(1), 105-111.
Hedges, L. V. (1987). How hard is hard science, how soft is soft science? The empirical cumulativeness of research. American Psychologist, 42(5), 443-455. doi: 10.1037/0003-066X.42.5.443
Huth, E. (2009). The move toward setting scientific standards for the content of medical review articles. Journal of the Royal Society of Medicine, 102(6), 247-251. doi: 10.1258/jrsm.2009.09k004
Hyland, K. (2004). Disciplinary discourses: Social interactions in academic writing. Ann Arbor, MI: University of Michigan Press.
Jennions, M. D., Lortie, C. J., & Koricheva, J. (2013). Using meta-analysis to test ecological and evolutionary theory. In J. Koricheva, J. Gurevitch, & K. Mengersen (Eds.), Handbook of meta-analysis in ecology and evolution (pp. 381-403). Princeton, NJ & Oxford: Princeton University Press.
Krippendorff, K. (2004). Content analysis: An introduction to its methodology. Thousand Oaks, CA: Sage.
Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts. Princeton, NJ: Princeton University Press.
Levitt, J. M., & Thelwall, M. (2009). Citation levels and collaboration within library and information science. Journal of the American Society for Information Science & Technology, 60(3), 434-442. doi: 10.1002/asi.21000
Leydesdorff, L., Carley, S., & Rafols, I. (2013). Global maps of science based on the new Web-of-Science Categories. Scientometrics, 94(2), 589-593.
Light, R. J., & Pillemer, D. B. (1984). Summing up: The science of reviewing research. Cambridge, MA: Harvard University Press.
Littell, J.H. (2008). Systematic reviews and meta-analysis. Oxford: Oxford University Press.
Meho, L. I. & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science & Technology, 58(13), 2105-2125.
Merlin, T., Weston, A., & Tooher, R. (2009). Extending an evidence hierarchy to include topics other than treatment: Revising the Australian 'levels of evidence'. BMC Medical Research Methodology, 9. doi: 10.1186/1471-2288-9-34
Morris, S. A., & Goldstein, M. L. (2007). Manifestation of research teams in journal literature: A growth model of papers, authors, collaboration, coauthorship, weak ties, and Lotka's law. Journal of the American Society for Information Science & Technology, 58(12), 1764-1782.
Mulrow, C. D. (1987). The medical review article: State of the science. Annals of Internal Medicine, 106, 485-488.
Myers, G. (1991). Stories and styles in two molecular biology review articles. In C. Bazerman & J. Paradis (Eds.), Textual dynamics of the professions: Historical and contemporary studies of writing in professional communities (pp. 45-75). Madison, WI & London: University of Wisconsin Press.
Pachucki, M. A., & Breiger, R. L. (2010). Cultural holes: Beyond relationality in social networks and culture. Annual Review of Sociology, 36, 205-224.
Patsopoulos, N. A., Analatos, A. A., & Ioannidis, J. P. (2005). Relative citation impact of various study designs in the health sciences. Journal of the American Medical Association, 293(19), 2362-2366. doi: 10.1001/jama.293.19.2362
Peters, H. P. F., & van Raan, A. F. J. (1994). On determinants of citation scores: A case study in chemical engineering. Journal of the American Society for Information Science, 45(1), 39-49.
Platt, J. A. (1996). A history of sociological research methods in America: 1920-1960. Cambridge, UK: Cambridge University Press.
Platt, J. A. (2006). Evidence and proof in documentary research: Part I, some specific problems of documentary research. In J. Scott (Ed.), Documentary Research (Vol. 1, pp. 83-103). Thousand Oaks, CA: Sage. (Reprinted from: Sociological Review, 29(1981), 31-52).
Pullin, A. S., Knight, T. M. and Watkinson, A. R. (2009), Linking reductionist science and holistic policy using systematic reviews: unpacking environmental policy questions to construct an evidence-based framework. Journal of Applied Ecology, 46, 970–975. doi: 10.1111/j.1365-2664.2009.01704.x
Rafols, I., & Meyer, M. (2010). Diversity and network coherence as indicators of interdisciplinarity: case studies in bionanoscience. Scientometrics, 82(2), 263-287. doi: 10.1007/s11192-009-0041-y
Ramage, D., & Rosen, E. (2011). Stanford TMT. http://nlp.stanford.edu/software/tmt
Ramsay, J. O., & Silverman, B. W. (2005). Functional data analysis (2nd ed.). New York: Springer.
Rogers, E. M. (2003). Diffusion of innovations. New York: Free Press.
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86, 638-641.
Sander, L., & Kitcher, H. (2006). Systematic and other reviews: Terms and definitions used by UK organisations and selected databases. Systematic review and Delphi study. National Institute for Health and Clinical Excellence.
Sahlin, K., & Wedlin, L. (2008). Circulating ideas: Imitation, translation, and editing. In R. Greenwood, C. Oliver, K. Sahlin & R. Suddaby (Eds.), Sage handbook of organizational institutionalism (pp. 218-242). London & Thousand Oaks, CA: Sage.
Sidlauskas, B. L., Ganapathy, G., Hazkani-Covo, E., Jenkins, K. P., Lapp, H., McCall, L. W., . . . Kidd, D. M. (2010). Linking big: the continuing promise of evolutionary synthesis. Evolution, 64(4), 871-880. doi: 10.1111/j.1558-5646.2009.00892.x
Small, H. G. (1986). The synthesis of specialty narratives from co-citation clusters. Journal of the American Society for Information Science, 37(3), 97-110. doi: 10.1002/(sici)1097-4571(198605)37:3<97::aid-asi1>3.0.co;2-k
Stirling, A. (2007). A general framework for analysing diversity in science, technology and society. Journal of the Royal Society Interface, 4(15), 707-719. doi: 10.1098/rsif.2007.0213
Suri, H., & Clarke, D. (2009). Advancements in research synthesis methods: From a methodologically inclusive perspective. Review of Educational Research, 79(1), 395-430. doi: 10.3102/0034654308326349
Toews, L. (2011). The information infrastructure that supports evidence-based veterinary medicine: A comparison with human medicine. Journal of Veterinary Medical Education, 38(2), 123-134. doi: 10.3138/jvme.38.2.123
van Elteren, P. H. (1960). On the combination of independent two-sample tests of Wilcoxon. Bulletin of the International Statistical Institute, 37, 351-361.
White, H. D. (1994). Scientific communication and literature retrieval. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 41-55). New York: Russell Sage Foundation.
Whitley, R. (2000). The intellectual and social organization of the sciences (2nd ed.). Oxford, UK: Oxford University Press.
Yoshii, A., Plaut, D. A., McGraw, K. A., Anderson, M. J., & Wellik, K. E. (2009). Analysis of the reporting of search strategies in Cochrane systematic reviews. Journal of the Medical Library Association, 97(1), 21-29. doi: 10.3163/1536-5050.97.1.004