Ann Langley, Strategies for Theorizing From Process Data


Strategies for Theorizing from Process Data
Author(s): Ann Langley
Source: The Academy of Management Review, Vol. 24, No. 4 (Oct., 1999), pp. 691-710
Published by: Academy of Management
Stable URL: http://www.jstor.org/stable/259349
Accessed: 28/07/2011 12:33


© Academy of Management Review 1999, Vol. 24, No. 4, 691-710.

STRATEGIES FOR THEORIZING FROM PROCESS DATA

ANN LANGLEY
Université du Québec à Montréal

In this article I describe and compare a number of alternative generic strategies for the analysis of process data, looking at the consequences of these strategies for emerging theories. I evaluate the strengths and weaknesses of the strategies in terms of their capacity to generate theory that is accurate, parsimonious, general, and useful and suggest that method and theory are inextricably intertwined, that multiple strategies are often advisable, and that no analysis strategy will produce theory without an uncodifiable creative leap, however small. Finally, I argue that there is room in the organizational research literature for more openness within the academic community toward a variety of forms of coupling between theory and data.

As change sweeps through industries, organizations, and workgroups, we are seeing a surge of interest among organizational researchers in process theory and dynamic phenomena, such as organizational learning (Cohen & Sproull, 1991), competitive interaction (Ilinitch, D'Aveni, & Lewin, 1996), innovation and change (Van de Ven & Huber, 1990), and strategic evolution (Barnett & Burgelman, 1996). One group of researchers has chosen to address these dynamics by formulating a priori process theories and testing them using coarse-grained longitudinal time series and event-history methods. Another camp has chosen rather to plunge itself deeply into the processes themselves, collecting fine-grained qualitative data (often, but not always, in real time) and attempting to extract theory from the ground up (Bower, 1997; Pettigrew, 1992; Van de Ven, 1992). The philosophy of this camp

is that to truly understand how and why events play out over time, we must examine them directly (Mintzberg, 1979).

I identify myself as a member of the second camp, but in no way think the task we have set ourselves is easy. Process data are messy. Making sense of them is a constant challenge. In this article I examine a number of different strategies for approaching this task. My objective is not to advocate one strategy or another, or even to propose radically new strategies (although I do draw on my own research with colleagues in delineating some of them), but, rather, to consider the strengths and weaknesses of different modes of analysis of process data in terms of their capacity to generate theory that is accurate, parsimonious, general, and useful (Weick, 1979). I further draw attention to the mutual dependence between methods and theories.

I begin by clarifying what I mean by process theory and process data and how I conceive of the theory-building task. After presenting the different analysis strategies, I discuss their various qualities, place them within an overall framework, and argue for more openness within the academic community toward a variety of forms of coupling between theory and data.

PROCESS DATA AND PROCESS THEORIZATION: THE CHALLENGE

Process data collected in real organizational contexts have several characteristics that make

I am indebted to colleagues Louise Côté, Jean-Louis Denis, and Jean Truax, who worked with me on several research projects that inspired this article. I also thank colleagues, students, and conference participants for their helpful comments and discussions and, especially, Christiane Demers, Richard Déry, Taieb Hafsi, Veronika Kisfalvi, Gérard Koenig, Gilbert Laporte, Jean Perrien, and Raymond-Alain Thiétart. Bob Sutton and two anonymous reviewers also encouraged me to improve the article. An earlier and less developed rendering of these ideas was published in French in Management International, under the title "L'étude des processus stratégiques: Défis conceptuels et analytiques" (Langley, 1997). This article was prepared while I was on sabbatical leave at l'École des Hautes Études Commerciales de Montréal. I thank the school for their welcome and their support.


them difficult to analyze and manipulate.¹ First, they deal mainly with sequences of "events": conceptual entities that researchers are less familiar with. Second, they often involve multiple levels and units of analysis whose boundaries are ambiguous. Third, their temporal embeddedness often varies in terms of precision, duration, and relevance. Finally, despite the primary focus on events, process data tend to be eclectic, drawing in phenomena such as changing relationships, thoughts, feelings, and interpretations. I elaborate briefly on these four characteristics below.

Data Composed of Events

Process research is concerned with understanding how things evolve over time and why they evolve in this way (see Van de Ven & Huber, 1990), and process data therefore consist largely of stories about what happened and who did what when - that is, events, activities, and choices ordered over time. In his classic work on organization theory, Mohr (1982) makes a clear distinction between what he calls "variance theory" and "process theory." Figure 1 illustrates this distinction applied to the problem of explaining strategic change.

Whereas variance theories provide explanations for phenomena in terms of relationships among dependent and independent variables (e.g., more of X and more of Y produce more of Z), process theories provide explanations in terms of the sequence of events leading to an outcome (e.g., do A and then B to get C). Temporal ordering and probabilistic interaction between entities are important here (Mohr, 1982). Understanding patterns in events is thus key to developing process theory.

"Events," however, are quite different entities from the "variables" that dominate methodology seminars and that most of us are more used to manipulating. The analysis of process data, therefore, requires a means of conceptualizing events

and of detecting patterns among them. As suggested by Van de Ven and Poole (1995), these patterns may take a variety of different forms, but the most common pattern found in the literature is the linear sequence of "phases" that occur over time to produce a given result (e.g., Burgelman, 1983; Rogers, 1983). However, the passage from raw data to synthetic models, whether expressed in terms of phases or otherwise, is far from simple. Abbott (1990) and Van de Ven (1992) have presented a number of techniques for analyzing event sequences when events are sharply defined in terms of the units of analysis they refer to and their location in time. These provide one strategy for analysis that I will discuss later. However, raw process data usually do not come quite so neatly sliced and packaged.

Data on Multiple Units and Levels of Analysis with Ambiguous Boundaries

Any researcher who has collected qualitative process data in organizations has seen how difficult it is to isolate units of analysis in an unambiguous way. For example, what should or should not be included in the definition of a decision-making process? Can researchers always distinguish (as Eisenhardt, 1989a, did in her research) between the decision to make a strategic change and the decision about what strategy should be adopted?

More complex phenomena, such as strategy formation or learning, are even harder to isolate. Process phenomena have a fluid character that spreads out over both space and time (Pettigrew, 1992). In addition, one of the main reasons for taking a qualitative process approach is precisely to take into account the context (Pettigrew, 1992; Yin, 1994). This leads, inevitably, to the consideration of multiple levels of analysis that are sometimes difficult to separate from one another - made up of a continuum, rather than a hierarchy or a clear classification. This further complicates the sensemaking process.

Data of Variable Temporal Embeddedness

When collecting process data, the researcher attempts to document as completely as possible the sequence of events pertinent to the processes studied. However, unless the process is highly circumscribed, certain phenomena will tend to be absent from a systematic list of ordered incidents. For example, there are often gradual background trends that modulate the progress of specific events. Also, part of what interests us may be going on in people's heads and leave no concrete trace of the exact moment of its passing.

Despite the apparent temporal precision indicated by the word "event," there are also clearly different levels of events: an event may include a bad year, a merger, a decision, a meeting, a conversation, or a handshake. Finally, particularly in macrolevel studies of such processes as strategy making, innovation, and decision making, the researcher is often obliged to combine historical data collected through the analysis of documents and retrospective interviews with current data collected in real time. While the first type of data is sparse and synthetic, focusing on memorable moments and broad trends, the second is richer and finer grained. And, while the first type misses certain useful nuances and details, the second type may require a certain distancing before it is possible to separate out what is really significant from what will be treated as merely noise (Leonard-Barton, 1990). These phenomena are often unavoidable, but they all render analysis and interpretation more difficult.

¹ Note that the collection of process data and the design of process studies also pose a number of challenges. However, these are not the main focus of this article. While recognizing that, to some extent, data collection and design issues constrain future options, my concern here is how to deal with the data, once collected. For more on design and data collection issues, see Eisenhardt (1989b), Leonard-Barton (1990), Pettigrew (1990), and Yin (1994).

FIGURE 1
Two Approaches to Explaining Strategic Change (after Mohr, 1982)
[Figure: a variance model explains strategic change as Y = f(x1, ..., xn), with attributes of environment, leadership, decision processes, and performance predicting the extent of strategic change; a process model explains it as events, activities, and choices unfolding from t0 to tn, leading from Strategy 1 to Strategy 2.]

Data That Are Eclectic

In his work Mohr (1982) insists strongly on the necessity of keeping variance and process theories separate. This requirement is extremely difficult to satisfy. Perhaps for aesthetic reasons, Mohr (1982) seems to want to artificially separate variables and events, although, in practice, phenomena of different kinds are intertwined. I would argue that the insistence on exclusion of variables from process research unnecessarily limits the variety of theories constructed. It may be important to understand the effect of events on the state of an entity (a variable) or to identify the effect of a contextual variable on the evolution of events. Process research may also deal with the evolution of relationships between people or with the cognitions and emotions of individuals as they interpret and react to events (Isabella, 1990; Peterson, 1998). Thus, although temporal phenomena remain one of their distinguishing features, process data are not composed only of descriptions of discrete events. They also incorporate a variety of other types of qualitative and quantitative information. Again, this makes analysis and interpretation more complex.

A process database, thus, poses considerable challenges. The sheer volume of words to be organized and understood can create a sense of drowning in a shapeless mass of information (Pettigrew's, 1990, much-quoted "death by data asphyxiation"). The complexity and ambiguity of the data make it difficult to know where to start. Also, although offering greater potential for new discovery, the open-ended inductive approach that most researchers use in process research tends to lead to postponement of the moment of decision between what is relevant and what is not, sometimes aggravating these difficulties (Miles & Huberman, 1994).

The complexity of process data is, of course, a reflection of the complexity of the organizational phenomena we are attempting to understand. More and more researchers have been questioning simple process models that assume neat linear progressions of well-defined phases leading to well-defined outcomes (Schwenk, 1985; Van de Ven, 1992). Although the linear phase model still has attractions, process representations now often show divergences from the main route, recycling between phases and parallel tracks (Mintzberg, Raisinghani, & Théorêt, 1976; Nutt, 1984; Schroeder, Van de Ven, Scudder, & Polley, 1989). Researchers are also increasingly recognizing that the presence of multilayered and changing contexts, multidirectional causalities, and feedback loops often disturbs steady progression toward equilibrium. Several scholars have, in fact, argued that chaos theory or complexity theory may offer the potential for better understanding organizational processes (e.g., Stacey, 1995; Thiétart & Forgues, 1995).

Thus, it is clear that we need better ways to model process phenomena. However, research that concludes simply that "everything is complex" or that "simple normative models do not work" is limited in its appeal. As Van de Ven (1992) notes, process theorization needs to go beyond surface description to penetrate the logic behind observed temporal progressions, whether simple or complex. I find it difficult to share the enthusiasm of some writers for the application of complexity theory to organizational phenomena, precisely because the specific explanatory mechanisms behind its application are often not specified. The general but banal insight that organizational processes involve opposing forces, nonlinear relationships, and feedback loops needs fleshing out.
One interesting point raised by these theorists, however, is that the interaction of a relatively small number of simple deterministic elements may generate complexity, if they take into account such phenomena. With this, there is hope that relatively parsimonious theoretical formulations may be able to make sense of the complexity observed in process data.

And this is where the central challenge lies: moving from a shapeless data spaghetti toward

some kind of theoretical understanding that does not betray the richness, dynamism, and complexity of the data but that is understandable and potentially useful to others. Throughout the remainder of this article, I examine seven generic strategies for achieving this. Following Weick (1979), I term these sensemaking strategies. The word "sensemaking" is used for two reasons. First, it implies the possibility that a variety of "senses" or theoretical understandings may legitimately emerge from the same data. In fact, I argue that different strategies tend to produce different forms of theory that are neither intrinsically better nor worse but may have different strengths and weaknesses. Second, it implies that the closing of the gap between data and theory can begin at either or both ends (data or theory) and may often iterate between them (Orton, 1997). Rigid adherence to purely deductive or purely inductive strategies seems unnecessarily stultifying. Indeed, Tsoukas (1989) goes further, arguing that while the data themselves can yield empirical regularities, abstract conceptualization is required to imagine the "generative mechanisms" that are driving them. For him, understanding comes from a combination of the two.

STRATEGIES FOR SENSEMAKING

The seven strategies for sensemaking described in this section were derived from an in-depth reading of the organization studies and methods literature and from my own research experience. I see the strategies as generic approaches, rather than step-by-step recipes or techniques. They are not necessarily exhaustive, and they can be used in combination. Each approach tends to overcome the overwhelming nature of boundaryless, dynamic, and multilevel process data by fixing attention on some anchor point that helps in structuring the material but that also determines which elements will receive less attention. It is because of this that the strategy used can have an important impact on the nature of the emerging theory.

Thorngate's (1976) and Weick's (1979) categories of accuracy, generality, and simplicity are used here to consider the theoretical forms likely to be developed using different strategies. Some strategies tend to stick closely to the original data, whereas others permit greater abstraction. Close data fitting reflects what Weick


(1979) calls "accuracy." However, accuracy may act against generality - another desirable quality related to the potential range of situations to which the theory may be applicable. Finally, simplicity concerns the number of elements and/or relationships in a theory. It affects the theory's aesthetic qualities. Simple theories with good explanatory power may actually be preferred to complex ones that explain a little more; as Daft (1983) suggests, good research is more like a poem than a novel.

In describing each strategy, I draw on exemplars in the organizational literature that appear to represent the best of what can be achieved with each approach. In my analysis I also look at the relative data needs of each approach both in terms of depth (process detail) and breadth (number of cases), as well as the extent to which each strategy deals with each of the process data characteristics mentioned above. Finally, I show how each strategy tends to favor different types of process understanding ("senses"). Some strategies seem best adapted to the detection of patterns in processes, whereas others penetrate their driving mechanisms. Some are more oriented toward the meaning of process for the people involved, whereas some are more concerned with prediction. The discussion is summarized in Table 1.

Narrative Strategy

This strategy involves construction of a detailed story from the raw data. In the area of strategic management, the classic example of this style is Chandler's (1964) history of the evolution of American enterprise. The same style also dominates the work of strategy researchers who adopt a "contextualist" perspective, notably Andrew Pettigrew and members of the Centre for Corporate Strategy and Change (Pettigrew, 1985, 1990; Pettigrew & Whipp, 1991), but also others working in this tradition (Dawson, 1994; Johnson, 1987). Descriptive narratives (or "realistic tales") are also the traditional tool of ethnographers (Van Maanen, 1988), and they frequently play a key role in studies of cultural change (Bartunek, 1984).

In fact, almost all process research involves recourse to this strategy at some point. However, the narrative can serve different purposes, depending on the objectives of the researcher. For many it is merely a preliminary step aimed at

preparing a chronology for subsequent analysis - essentially, a data organization device that can also serve as a validation tool (e.g., Eisenhardt, 1989b). For "contextualists" it plays a more substantial role, incorporating an analytical element:

    Our analytical chronologies reach towards theory presentation but are prepared to get on top of the data, to clarify sequences across levels of analysis, suggest causal linkages between levels, and establish early analytical themes (Pettigrew, 1990: 280).

Finally, for others who adopt a constructivist or naturalistic perspective (Dyer & Wilkins, 1991; Guba & Lincoln, 1994), the narrative can be the main product of the research. The aim is to achieve understanding of organizational phenomena - not through formal propositions but by providing "vicarious experience" of a real setting in all its richness and complexity (Lincoln & Guba, 1985: 359). For the proponents of this approach, it is the contextual detail in the narrative ("thick description") that will allow the reader to judge the transferability of the ideas to other situations. Indeed, good research of this type will often produce a sense of "déjà vu" among experienced readers. The theorist who adopts this philosophy tries to avoid excessive data reduction and to present as completely as possible the different viewpoints on the process studied.

This strategy avoids commitment to any specific anchor point, although because of the structure of narrative, time tends to play an important role. Also, because of its focus on contextual detail, this approach works best for one or a few cases. Ideally, the variety and richness of the incidents described and of the linkages between them should convey a high degree of authenticity that cannot be achieved economically with large samples (Golden-Biddle & Locke, 1993). In the hands of an accomplished writer, this sensemaking strategy has the great advantage of reproducing in all its subtlety the ambiguity that exists in the situations observed. It avoids the necessity of clear definitions when boundaries are not clear, and it easily accommodates variable temporal embeddedness and eclectic data. The philosophy behind this type of analysis is well expressed by Van Maanen: "To be determinate, we must be indeterminate" (1995: 139).

In Weick's (1979) terms, accuracy is therefore expected to be high. However, those who adopt


TABLE 1
[Table 1, which summarizes the comparison of the seven sensemaking strategies, is illegible in this scan.]


a more traditional research perspective may be dissatisfied because this approach does not, on its own, lead to either simple or general theory. Without denying the usefulness of the narrative approach for communicating the richness of the context to readers, most of us expect research to offer more explicit theoretical interpretations. When relying on this strategy alone, one may too easily end up with an idiosyncratic story of marginal interest to those who were not involved and a rather thin conceptual contribution. Appealing process research needs to push beyond authenticity to make readers feel that they learned something of wider value (Golden-Biddle & Locke, 1993).

The intrinsic interest of the phenomenon studied can sometimes offer this value - for example, narratives that dig under the surface of dramatic events can be very effective, as in Vaughan's (1996) analysis of the Challenger disaster. But, beyond this, the most interesting and compelling narratives (including Vaughan's) are not so purely descriptive. They know where they are going. Like Chandler's (1964) stories of the invention of the M-form organization, they have embedded "plots" and "themes" that serve as sensemaking devices (Woiceshyn, 1997) and that ultimately become more explicit theories (e.g., "structure follows strategy"). However, simultaneously telling the complete story while setting the plot is a tall order. Other strategies can help out here.

Quantification Strategy

At the opposite end of the spectrum from the narrative strategy is a form of process analysis that has been most effectively promoted by Andrew Van de Ven and colleagues of the Minnesota Innovation Research Project (Van de Ven & Poole, 1990). In this approach researchers start with in-depth process data and then systematically list and code qualitative incidents according to predetermined characteristics, gradually reducing the complex mass of information to a set of quantitative time series that can be analyzed using statistical methods.

For example, in their innovation project, Van de Ven's team first collected detailed real-time data and then identified five characteristics or tracks that could be used to analyze each identifiable incident (people, ideas, transactions, context, and results). The incidents and their

corresponding tracks were transformed into a series of binary codes associated with a specific date, forming a 0-1 matrix that the authors call a "bit-map." Each incident corresponded to a line in the matrix. One column was reserved to indicate positive outcomes (0 or 1) associated with an incident and another was reserved for negative outcomes. A third column was used to indicate whether or not there was a change in the people involved in the innovation, and so on (Van de Ven & Poole, 1990). Once the coding was complete, the researchers worked with the binary data matrix and used statistical methods to search for patterns and test theoretical explanations. For example, Garud and Van de Ven (1992) and Van de Ven and Polley (1992) tested a dynamic theory of learning during innovation. The same data were used to examine whether the sequences reflected random, chaotic, or periodic processes as the innovation evolved (Cheng & Van de Ven, 1996).

Similar approaches have been used by Smith, Grimm, and Gannon (1992) to analyze competitive interactions among airlines and by Romanelli and Tushman (1994) to examine patterns of change in the microcomputer industry. Unlike Van de Ven's team, however, these researchers began with mainly documentary databases consisting of newspaper articles, 10K reports, and the like.
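The kind of "bit-map" data structure described above can be sketched in a few lines of code. The track names and incidents below are invented for illustration only; they are not Van de Ven and Poole's actual coding scheme, and a real study would code far more incidents with carefully derived categories.

```python
# Sketch of a "bit-map": each coded incident becomes a dated row of 0/1
# indicators, so the qualitative record can be analyzed as binary time series.
from dataclasses import dataclass
from datetime import date

# Hypothetical tracks; the real scheme covered people, ideas, transactions,
# context, and results, with separate positive/negative outcome columns.
TRACKS = ["people_change", "idea_change", "transaction",
          "context_event", "positive_outcome", "negative_outcome"]

@dataclass
class Incident:
    when: date
    description: str
    codes: set  # which tracks apply to this incident

def to_bitmap(incidents):
    """Return one 0/1 row per incident, ordered by date."""
    rows = []
    for inc in sorted(incidents, key=lambda i: i.when):
        rows.append([1 if t in inc.codes else 0 for t in TRACKS])
    return rows

# Invented example incidents.
incidents = [
    Incident(date(1985, 3, 1), "new project manager hired",
             {"people_change"}),
    Incident(date(1985, 6, 15), "prototype fails field test",
             {"negative_outcome", "idea_change"}),
    Incident(date(1986, 1, 10), "licensing deal signed",
             {"transaction", "positive_outcome"}),
]

for row in to_bitmap(incidents):
    print(row)
```

Each column of the resulting matrix is a binary time series (e.g., the `positive_outcome` column) that can then be fed to the statistical methods discussed next.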

The advantage of the quantification approach lies in the systematization of process analysis. Assuming that the original data are complete and that the coding of incidents is reliable, descriptive patterns in the sequence of events can be verified systematically and explicit process theories can be tested rigorously. Note, however, that despite the conversion of the data to quantitative form, the types of statistical analysis appropriate to process theorizing are somewhat different from those used in most variance research.

For example, many process theories are founded on the idea that there are fundamental similarities in the patterns of event sequences across cases. However, traditional techniques (regression, ANOVA, and so forth) are designed to explain differences (variance) - not to show similarities. The sequence methods proposed by Abbott (1990) include the use of multidimensional scaling, to identify "typical sequences" across different cases, and of optimal matching algorithms (like those used in DNA testing), to


    698 Academy of Management Review Octoberestimate the proximity between sequences andto develop event-sequence typologies. Sabher-wal and Robey (1993) adopted this method, forexample, to develop a taxonomy of information-system implementation processes based on thedetailed coding of 53 event chronologies.More commonly, however, quantitative re-searchers examining process phenomena haveused techniques such as event-history analysis,lagged regression, log-linear models, and dy-namic simulation. Rather than testing for thesimilarity of whole sequence patterns acrosscases, these methods are appropriate for exam-ining the dynamic relationships between eventswithin a single case or a population. Monge(1990) provides a detailed overview of the theo-retical forms that can be considered in this wayand the appropriate statistical techniques foreach. The approach seems particularly usefulfor the verification of dynamic theories that in-clude causal feedback loops. All this supposes,however, that comparable incidents within thesame case or across similar cases are suffi-ciently large in number to create enough de-grees of freedom for the statistical analysis. Forthis, "incidents" must be defined to be very ge-neric in form, with little contextual richness andvariability remaining attached to them.In contrast with the narrative approach, thisstrategy leads more easily to parsimonious the-oretical conceptualizations (i.e., simplicity). Be-cause of the generic character of coded eventsand the mathematical formulation of the modelstested-often supported by deductive reason-ing-the theorization is also likely to havegreater potential generality (although replica-tion is needed to verify this). 
Yet, to achieve this result, the approach drastically simplifies the original data, setting aside certain dimensions and replacing the ambiguous, rich, and specific context by precise, thin, and general indicators. There is little room for variable temporal embeddedness or ill-defined boundaries in the emerging models. Accuracy, thus, is not necessarily the strong suit of such theories, even though the gap between the data and the emerging model may appear to be more defensible than in certain other strategies, because it can be assessed and justified rationally by reasonable interrater reliabilities (that take us from the data to its coded representation) and good R-squareds (that get us from the coded representation to the final model).
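To make the optimal matching idea concrete, the sketch below computes an edit-distance proximity between two coded event sequences, in the spirit of the sequence methods Abbott describes. The event codes and costs are hypothetical illustrations, not taken from any of the studies cited; real applications derive substitution costs from the coding scheme itself.

```python
# Sketch: optimal matching (edit distance) between coded event sequences.
# Event codes and unit costs are hypothetical illustrations.

def optimal_matching_distance(seq_a, seq_b, indel_cost=1.0, sub_cost=1.5):
    """Minimum total cost of insertions, deletions, and substitutions
    needed to turn seq_a into seq_b (standard dynamic programming)."""
    m, n = len(seq_a), len(seq_b)
    # dist[i][j] = cost of aligning seq_a[:i] with seq_b[:j]
    dist = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dist[i][0] = i * indel_cost
    for j in range(1, n + 1):
        dist[0][j] = j * indel_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            match = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub_cost
            dist[i][j] = min(dist[i - 1][j] + indel_cost,   # deletion
                             dist[i][j - 1] + indel_cost,   # insertion
                             dist[i - 1][j - 1] + match)    # (mis)match
    return dist[m][n]

# Two hypothetical implementation chronologies, coded as event types.
case_1 = ["initiate", "evaluate", "adopt", "adapt", "routinize"]
case_2 = ["initiate", "adopt", "evaluate", "adapt", "abandon"]
print(optimal_matching_distance(case_1, case_2))  # 3.5 with these costs
```

Pairwise distances of this kind could then feed a clustering or multidimensional scaling step to surface "typical sequences" across cases.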

In fact, although I have taken part in similar exercises myself and know why we do these things, there is a certain irony in the idea that researchers who give themselves the trouble of collecting rich qualitative data in real organizations are so uncomfortable with this richness that they immediately rush to transform it, through another extremely demanding process, into a much thinner data set that can be managed in traditional ways. The quantification strategy will be much more convincing if it is used in combination with other approaches that allow contextualization of the abstract data, adding nuances of interpretation and confirming the mechanics of the mathematical model with direct evidence. The articles by Garud and Van de Ven (1992) and Van de Ven and Polley (1992) on learning during innovation are interesting from this viewpoint. However, those who rely solely on the quantification strategy may lose critical elements of process understanding in abstractions so general that the results obtained may be clear but fairly banal.

The two strategies just described lie at the two ends of a continuum that opposes empirical accuracy and theoretical parsimony. I now present some more middle-of-the-road approaches.

Alternate Templates Strategy

In this sensemaking strategy the analyst proposes several alternative interpretations of the same events based on different but internally coherent sets of a priori theoretical premises. He or she then assesses the extent to which each theoretical template contributes to a satisfactory explanation.

The strategy was popularized by Allison (1971), in his classic study of the decisions made during the Cuban Missile Crisis. The three explanatory templates used by Allison were a "rational actor model," in which the United States and the Soviet Union were viewed as unified rational actors selecting alternatives to achieve national objectives; an "organizational process model," in which decision making was seen as driven by organizational routines (Cyert & March, 1963); and a "political model," in which individuals involved in the crisis were viewed as pursuing their own personal interests within a distributed power structure. Allison (1971) produced three retellings of the story, each drawing on a different model. He concluded that the last two seemed superior to the first, allowing the explanation of certain events that otherwise appeared mysterious.

This strategy has been used often since then for the study of decision processes (e.g., Pinfield, 1986; Steinbruner, 1974), perhaps partly because of Allison's example, but also perhaps because of the difficulty of developing a unique model of decision making that simultaneously captures all of its dimensions (Langley, Mintzberg, Pitcher, Posada, & Saint-Macary, 1995). The strategy also has attracted adherents among information systems researchers concerned with implementation processes (e.g., Lee, 1989; Markus, 1983). In strategic management the work of Collis (1991) on globalization also reflects this approach.

Because this strategy draws theory from outside the data, it is essentially deductive. In some applications predictions of the competing theories are formally "tested" in a hypothetico-deductive fashion, with specific predictions being refuted to reject weaker theories (e.g., Markus, 1983). This is similar to Yin's (1994) idea of "pattern-matching." Often, though, the different interpretations are less like true "tests" of theory and more like alternate complementary readings that focus on different variables and levels of analysis and reveal different types of dynamics. Many broad process theories, such as political models (Allison, 1971), organizing theory (Weick, 1979), or structuration theory (Giddens, 1984), are alternative modes of sensemaking that are not easily refutable because their constructs seem adaptable (e.g., in political models it is usually possible to find personal goals that make observed action rational).
However, a confrontation among different interpretations can reveal the contributions and gaps in each.

Although some researchers have counseled against using single case studies in process research because of the lack of material for replication and comparison (Eisenhardt, 1989b; Pettigrew, 1990), this strategy provides a powerful means of deriving insight from a single rich case because the different theoretical interpretations provide the base for comparison needed (Lee, 1989; Yin, 1994). Each interpretation strategy may also force the researcher to collect different types of data, with the more finely grained theories actually becoming very demanding, further revealing the relative contribution of each perspective (Allison, 1971).
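As a toy illustration of how competing templates might be confronted, the sketch below scores a coded event list against indicator features associated with each template. The events, the template "signatures," and the scoring rule are all hypothetical; a real analysis rests on qualitative interpretation and retelling, not on keyword counts.

```python
# Sketch: confronting alternate theoretical templates with one event list.
# Events and template "signatures" are hypothetical illustrations only.

events = [
    {"actor": "executive team", "features": {"goal_directed", "options_compared"}},
    {"actor": "operations unit", "features": {"routine_followed"}},
    {"actor": "division head", "features": {"self_interest", "coalition_building"}},
    {"actor": "executive team", "features": {"routine_followed", "goal_directed"}},
]

templates = {
    "rational actor": {"goal_directed", "options_compared"},
    "organizational process": {"routine_followed"},
    "political": {"self_interest", "coalition_building"},
}

# Share of events each template can "account for"
# (i.e., the event exhibits at least one feature the template predicts).
coverage = {name: sum(1 for e in events if e["features"] & signature)
            for name, signature in templates.items()}
for name, covered in coverage.items():
    print(f"{name}: accounts for {covered}/{len(events)} events")
```

The point of such a tabulation is only to expose where each lens explains events and where it is silent, which is the comparison the strategy turns on.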

Overall, this strategy combines both richness and theoretical parsimony (simplicity) by decomposing the problem. Qualitative nuances are represented through the alternative explanations, and theoretical clarity is maintained by keeping the different theoretical lenses separate (at least in most applications of this approach). Between them, then, different theoretical perspectives provide overall accuracy, although each one is inaccurate on its own. Generality in this approach comes from the use of deductive theories that have broad application.

However, despite its advantages, the use of this strategy often leaves the researcher and the reader puzzled as to how the various theoretical perspectives can be combined. Almost inevitably, each explanation taken alone is relevant but insufficient. Yet, any theory that attempted to integrate the different perspectives would tend to become unwieldy and aesthetically unsatisfying. As Allison indicates at the end of his book:

The developed sciences have little hesitation about partial models. . . . The aspiring sciences tend to demand general theory. In satisfying this demand, they often force generalization at the expense of understanding. Refining partial paradigms, and specifying the classes of actions for which they are relevant, may be a more fruitful path to limited theory and propositions than the route of instant generalization (1971: 275).

Grounded Theory Strategy

As noted by several meta-analysts of qualitative research (Larson & Løwendahl, 1995; Locke, 1996), Glaser and Strauss's (1967) The Discovery of Grounded Theory is one of the most-cited methods texts in qualitative research articles. And, yet, there is sometimes limited evidence in these articles of the systematic theory-building methods proposed by the authors and subsequently refined by Glaser (1978) and Strauss and Corbin (1990; see Locke, 1996). For many, "grounded theory" is basically a generic synonym for any kind of inductive theorizing. This is perhaps not surprising, for the language itself expresses this idea. However, in this article I use grounded theory strategy to refer to the more specific methods described by the original authors.

When followed "by the book," the grounded theory approach as described most recently by Strauss and Corbin (1990) incorporates a series of highly structured steps. It involves the systematic comparison of small units of data (incidents) and the gradual construction of a system of "categories" that describe the phenomena being observed. The categories may have several "subcategories," and associated "dimensions" and "properties," which are gradually elaborated and refined as specific incidents are examined, systematically coded, and compared. As the categories are developed, the researcher deliberately seeks out data that will enable verification of the properties of emerging category systems. The analysis should eventually result in the identification of a small number of "core categories," which serve to tightly integrate all the theoretical concepts into a coherent whole firmly rooted ("grounded") in the original evidence.

At first sight, process data offer many opportunities for grounded theorizing. Indeed, Glaser (1978) and Strauss and Corbin (1990) insist on the necessity of incorporating processes into any grounded theory study. They note that processes are categories that have two or more identifiable "stages" (Glaser, 1978) and that the most useful core categories are often expressed as gerunds (i.e., in process terms). Several grounded theory process studies in the literature are faithful to this portrait (e.g., Sutton's, 1987, model of organizational death as "disbanding" and "reconnecting" and Gioia & Chittipeddi's, 1991, representation of the initiation of strategic change as "sensemaking" and "sensegiving").

However, I would argue that the strategy "makes more sense" for some types of process data than for others. Generally, it demands a fairly large number of comparable incidents that are all richly described. Thus, while one setting may be sufficient, there should at least be several distinct processes that can be compared in depth (e.g., as in Burgelman's, 1983, internal venturing study).
Alternatively, the level of analysis can be dropped away from the overall site to a more microlevel to explore the interpretations and emotions of different individuals or groups living through the same processes (e.g., Isabella, 1990; Sutton, 1987). It is here that the strategy often appears at its most powerful. However, when the objective is to understand more macroscopic processes that occur one at a time over long periods (like strategic change in a large organization), the processes' broad sweep seems to fit less well with the microanalysis of the textbook grounded theory approach. The data themselves may not have the density required to support it, and the microfocus risks losing the broad pattern of the forest for the descriptive detail of the trees.

In summary, used alone, this is a strategy that tends to stay very close to the original data and is therefore high in accuracy. It starts with empirical details expressed in interview transcripts and field notes and attempts to build a theoretical structure "bottom up" from this base. Yet, because of the specialized language, the logic of the method, and the deliberately hierarchical structure of category systems, theories developed in this way are at the same time very dense (low to moderate in simplicity) but often seem to have a similar flavor and general structure (compare, for example, the exemplary grounded theory studies of very different phenomena by Gioia, Thomas, Clark, & Chittipeddi, 1994, and Browning, Beyer, & Shetler, 1995). As its proponents note, firm grounding in the raw data can also sometimes make it difficult to move from a "substantive" theory of a specific phenomenon to more general formal theory (Glaser & Strauss, 1967).

Visual Mapping Strategy

Process data analysis may involve the manipulation of words (e.g., narrative strategies or grounded theory), of numbers (quantification), or of matrix and graphical forms (Miles & Huberman, 1994). Such forms have several advantages over narrative approaches, according to Miles and Huberman (1994). They allow the presentation of large quantities of information in relatively little space, and they can be useful tools for the development and verification of theoretical ideas. Visual graphical representations are particularly attractive for the analysis of process data because they allow the simultaneous representation of a large number of dimensions, and they can easily be used to show precedence, parallel processes, and the passage of time.

For example, Figure 2 is taken from a study of the process of adoption of new technology in small manufacturing firms (Langley & Truax, 1994).

[Figure 2: a visual map of a coded event chronology from the technology adoption study; the graphic did not survive extraction.]

The drawing presents an event chronology coded in multiple ways. The form of the boxes indicates whether the event described represents a decision (round-cornered rectangles), an activity (sharp-cornered rectangles), or an event outside the control of the firm (ovals). The location of each box in one of the six horizontal bands indicates the issue domain with which the event is associated. Certain boxes cross several bands, indicating the integrative character of that event. The arrows leading from each box to the central band indicate the effect of this event on the technology adoption process (positive effect [+], negative effect [-], precipitating effect [++], reorienting effect [0]). The thickness of the horizontal lines linking the boxes indicates the degree of continuity among linked events. Finally, the horizontal time scale allows representation of event ordering and parallel tracks over time and provides a rough indication of their temporal duration. The drawing is obviously a summary of what took place in the case, but the link between it and the qualitative database is maintained through the use of short descriptions of each element in its corresponding box.

This type of drawing obviously is not a "theory" but an intermediary step between the raw data and a more abstract conceptualization. To move toward a more general understanding, one might, in further analysis, compare several such representations to look for common sequences of events and common progressions in sources of influence (Langley & Truax, 1994). One could also proceed to developing more abstract coding to generate local "causal maps" that would constitute the beginnings of a middle-range theoretical explanation (as described by Miles & Huberman, 1994). Finally, one might compare and integrate several such causal maps to elaborate a more general theory. Lyles and Reger (1993) apply an approach like this in their study of the evolution of managerial influence in a joint venture over a period of 30 years.

Different forms of process mapping have long been used by organizations to plan, understand, and correct their own work processes (in systems analysis, quality improvement, business process reengineering, and so forth).
Organizational researchers could perhaps learn from this example. Meyer (1991) notes how flowcharts of capital budgeting processes proved useful in making sense of disparate accounts and in communicating with informants to collect further data. These charts also became the raw material for the development of both a more comprehensive process model (Meyer, 1984) and a variance model of innovation adoption (Meyer & Goes, 1988).
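The multidimensional event coding behind a map like Figure 2 can be represented directly as data before any drawing is attempted. The sketch below is a hypothetical illustration (not the coding scheme actually used by Langley and Truax): each event carries a type, an issue domain, an effect on the focal process, and a time position, from which a graphical timeline could then be rendered.

```python
# Sketch: a data structure for multidimensionally coded process events.
# All field values are hypothetical illustrations of the kinds of coding
# a visual map might rest on (event type, issue domain, effect, timing).

from dataclasses import dataclass

@dataclass
class CodedEvent:
    description: str
    kind: str    # "decision", "activity", or "external"
    domain: str  # issue domain (one of the horizontal bands)
    effect: str  # "+", "-", "++" (precipitating), or "0" (reorienting)
    month: int   # position on the horizontal time scale

chronology = [
    CodedEvent("Supplier demonstrates new machine", "external", "technology", "++", 1),
    CodedEvent("Owner visits trade show", "activity", "technology", "+", 3),
    CodedEvent("Major client lost", "external", "markets", "-", 5),
    CodedEvent("Decision to purchase equipment", "decision", "technology", "+", 8),
]

# A crude textual rendering of the timeline, one line per event.
for e in sorted(chronology, key=lambda e: e.month):
    print(f"month {e.month:>2} | {e.domain:<10} | {e.kind:<8} [{e.effect}] {e.description}")
```

Keeping short descriptions in each record preserves the link back to the qualitative database, just as the boxes in the original figure do.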

Process mapping also has been a favored technique for decision researchers, constituting the foundation of Mintzberg et al.'s (1976) classic article and of Nutt's (1984) work on decision process typologies. In his advocacy of a "grammatical" approach to process analysis, Pentland (1995) also implicitly suggests the usefulness of process mapping. He discusses the need to detect the underlying rules (grammar) driving the ordering of different types of moves (specific operations) or syntactic elements (groups of moves serving similar functions) within a repetitive process.

Approaches like those described require many observations of similar processes. This indicates that the mapping strategy may be most fruitful as a theory development tool for the analysis of multiple holistic or embedded cases. Of course, as a simple presentational method, it has broader application.

Process mapping allows the preservation of some dimensions of data ambiguity but excludes others. For example, Figure 2 does not force artificial clarity on the identification of the main unit of analysis, and it conceptualizes technology adoption as an evolutionary phenomenon that interacts in a dynamic way with other issues important to the firm (Langley & Truax, 1994). Yet this representation gives no room to such factors as power, conflict, and emotion. In part, the range of possibilities for mapping depends on the researcher's objectives and creativity (e.g., see Newman & Robey, 1992, for ways of representing encounters between actors on process diagrams). However, graphical forms may be biased toward the representation of certain types of information and against others. Relations of temporal precedence, authority, and influence between objects or individuals are quite easily represented. Continuous traces could even be used to represent the levels of key variables (e.g., financial performance).
However, emotions and cognitions are less easy to express in this way, being more difficult to temporally pin down.

The graphical strategy, thus, offers a means of data reduction and synthesis that is less radical and more flexible than that used in the quantification strategy (moderate accuracy). However, unless supported by other methods, the conclusions derived from it can have a rather mechanical quality, dealing more with the surface structure of activity sequences than with the underlying forces driving them. For this reason its conceptualizations will tend to be of moderate generality. The approach can produce useful typologies of process components, but attempts to reach beyond this to deeper generalizations are often less parsimonious because of the large number of variations possible and the difficulty of predicting which ones will occur and why (moderate simplicity).

Temporal Bracketing Strategy

The time scale along the bottom of Figure 2 is decomposed into three successive "periods." These periods do not have any particular theoretical significance. They are not "phases" in the sense of a predictable sequential process but, simply, a way of structuring the description of events. If those labels were chosen, it was because there is a certain continuity in the activities within each period and there are certain discontinuities at its frontiers (Langley & Truax, 1994). Many temporal processes can be decomposed in this way, at least partly, without presuming any progressive developmental logic. However, beyond its descriptive utility, this type of temporal decomposition also offers interesting opportunities for structuring process analysis and sensemaking. Specifically, it permits the constitution of comparative units of analysis for the exploration and replication of theoretical ideas. This can be especially useful if there is some likelihood that feedback mechanisms, mutual shaping, or multidirectional causality will be incorporated into the theorization. We see this strategy at work in the contributions of several process researchers (e.g., Barley, 1986; Denis, Langley, & Cazale, 1996; Doz, 1996; Dutton & Dukerich, 1991).

We call this strategy "bracketing" in reference to Giddens's (1984) structuration theory, a classic example of a perspective involving mutual shaping. At the heart of structuration theory is the idea that the actions of individuals are constrained by structures (including formal and informal rules and norms) but that these actions may also serve to reconstitute those structures over time. Because mutual influences are difficult to capture simultaneously, it is easier to analyze the two processes in a sequential fashion by temporarily "bracketing" one of them (Giddens, 1984). The decomposition of data into successive adjacent periods enables the explicit examination of how actions of one period lead to changes in the context that will affect action in subsequent periods.

In his study of structuring in two radiology departments following the acquisition of CT scanners, Barley (1986) consciously adopts this approach. He observed how the initial institutional context of the departments studied affected the pattern of interactions between radiologists and technicians and then how these patterns evolved and led to changes in the institutional context. This, in turn, became the point of departure for another phase of structuring. His detailed process data were analyzed and compared across successive periods separated by discontinuities in the institutional context, producing a compelling account of the role of technology in the evolution of structure.

In their study of strategic change under ambiguous authority, Denis et al. (1996) also adopted this strategy in order to better understand the mutual linkages between the tactics used by the members of a management team and the evolution of leadership roles within it. These were traced over five periods, separated by discontinuities in team membership. Denis et al. (1996) observed that certain types of tactics favor the creation of a unified team with the power to successfully promote change. However, once the team is created, the temptation and the possibility of using more coercive tactics lead to the fragmentation of the team, even as the change is solidified. Alternating dynamics, thus, are observed in successive periods. Again, the "periods" become units of analysis for replicating the emerging theory. Doz (1996) used a similar approach to trace patterns in the cycles of learning and reevaluation associated with strategic alliance development.

With this strategy, a shapeless mass of process data is transformed into a series of more discrete but connected blocks. Within phases, the data are used to describe the processes as fairly stable or linearly evolving patterns.
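The mechanics of cutting a chronology into brackets can be sketched in a few lines. The events and the discontinuity markers below are hypothetical; in practice, period boundaries are identified qualitatively from the data, not declared in advance.

```python
# Sketch: decomposing an ordered event chronology into temporal brackets
# at discontinuities. Events and boundary markers are hypothetical.

def bracket(chronology, is_discontinuity):
    """Split an ordered event list into periods, starting a new period
    at each event flagged as a discontinuity."""
    periods = [[]]
    for event in chronology:
        if is_discontinuity(event) and periods[-1]:
            periods.append([])
        periods[-1].append(event)
    return periods

chronology = [
    "new CEO arrives",          # discontinuity
    "strategy review launched",
    "pilot project approved",
    "merger announced",         # discontinuity
    "integration team formed",
    "restructuring completed",
]

discontinuities = {"new CEO arrives", "merger announced"}
periods = bracket(chronology, lambda e: e in discontinuities)
for i, p in enumerate(periods, 1):
    print(f"Period {i}: {p}")
```

Each resulting period then serves as a comparative unit of analysis in which within-phase patterns can be described and the emerging theory replicated.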
Evidence is also drawn together to examine how the context affects these processes, and what the consequences of these processes are on the future context and other relevant variables of interest. Discontinuities lead to replication of the analysis in a new phase.

This sensemaking strategy fits well with a nonlinear dynamic perspective on organizational processes, and it can quite easily handle eclectic data that include events, variables, interpretations, interactions, feelings, and so on. Because of its internal replication possibilities, one or a few cases may be sufficient to produce useful insights (all studies cited in this section are based on one or two cases only). However, temporal decomposition can create certain distortions. For example, there is no a priori guarantee that discontinuities will naturally synchronize themselves to produce unequivocal periods. Overall, then, accuracy is likely to be moderate to high, depending on the appropriateness of the temporal decomposition and the robustness of the analysis to different periodizations. Conceptualizations emerging from the process are unlikely to be very simple, although they stand a better chance of dealing with fundamental process drivers than those produced by certain other strategies. Assuming that they have been derived inductively, they will also have moderate generality, until tested on more data.

Synthetic Strategy

One recurring criticism of process theorizing is that despite its capacity to produce enriched understanding and explanation, it often lacks predictive power (Rumelt, 1997; Van de Ven, 1992). With the sensemaking strategy that we have termed synthetic, the researcher takes the process as a whole as a unit of analysis and attempts to construct global measures from the detailed event data to describe it. The researcher then uses these measures to compare different processes and to identify regularities that will form the basis of a predictive theory relating holistic process characteristics to other variables (e.g., outcomes and contexts). The work of Eisenhardt and colleagues (1989a,b; Eisenhardt & Bourgeois, 1988) on decision making in high-velocity environments is the obvious exemplar for this strategy. Others include Meyer and Goes' (1988) work on technology adoption and Bryson and Bromiley's (1993) work on new product planning and implementation.

When this strategy is used, the original process data are transformed from stories composed of "events" to "variables" that synthesize their critical components. The emerging models, thus, are "variance theories," not "process theories," in Mohr's (1982) words. For example, Eisenhardt (1989a) compared eight cases of decision making and developed a causal model to explain decision speed as a function of five process constructs: (1) the type of information used, (2) the pattern of alternatives examined, (3) the advice process adopted, (4) the conflict-resolution approach used, and (5) the degree of integration of decisions. In this case the constructs were developed through inductive exploration and coding of case narratives, as well as certain quantitative indicators. Their linkages to the dependent variable (speed) were verified through tabular displays and investigation of the mechanisms by which the effects were obtained, drawing on both the data and existing theory.

One interesting aspect of these process variables is that they are not necessarily the standard process variables (e.g., use of planning and rationality) that might have been chosen had the researcher simply developed a questionnaire-based study (e.g., cf. Dean & Sharfman, 1996). Rather, they incorporate more subtle nuances, including aspects of timing (e.g., simultaneity of alternatives, deadline versus leader-driven conflict resolution, and real-time versus delayed information), that could only be detected as important through close contact with real processes. In this way detailed process data can lead to more meaningful and potentially more powerful explanatory variables for nomothetic research (see also Nutt, 1993, for another example).

However, this is not process theory; the complexities of the probabilistic interaction of events, parallel and alternate tracks, patterns of mutual shaping over time, and evolving performance have been compressed into positions on a small number of scales that can now be related to a single overall "success" assessment. In fact, it is clear that despite major investments in the collection of process data, synthetic variance models exert an inexorable attraction.
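In spirit, the synthetic move goes from event lists to case-level variables that can be tabulated against an outcome. The sketch below uses entirely hypothetical cases and a hypothetical measure (loosely inspired by "simultaneity of alternatives," not Eisenhardt's actual data or coding): one holistic variable is derived per case and displayed next to an outcome assessment.

```python
# Sketch: deriving a holistic process measure from coded event data and
# tabulating it against an outcome. All cases and values are hypothetical.

cases = {
    "Case A": {"events": [("option considered", 1), ("option considered", 1),
                          ("decision", 3)], "outcome": "fast"},
    "Case B": {"events": [("option considered", 2), ("decision", 10)],
               "outcome": "slow"},
    "Case C": {"events": [("option considered", 1), ("option considered", 2),
                          ("option considered", 2), ("decision", 4)],
               "outcome": "fast"},
}

def simultaneity(events):
    """Holistic measure: number of alternatives examined in parallel,
    counted here as options sharing the same time stamp."""
    option_months = [t for kind, t in events if kind == "option considered"]
    return len(option_months) - len(set(option_months))

# Tabular display of the process measure against the outcome, case by case.
for name, case in cases.items():
    print(f"{name}: simultaneity={simultaneity(case['events'])}, "
          f"outcome={case['outcome']}")
```

A table of this kind is only the starting point; the qualitative database is still needed to show the mechanism by which the measured characteristic produces the outcome.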
Assoon as researchers become interested in under-standing the reasons for different outcomes,they tend to be drawn into formulating the prob-lem in terms of explanatory variables (see, forexample, even the major longitudinal studies onstrategic change processes by Hinings &Green-wood, 1988, and by Pettigrew & Whipp, 1991).Such an approach can generate important con-clusions-often richer and more credible onesthan could be obtained from thinner cross-sectional data, because the causal links aremore explicitly traceable. Nevertheless, as withthe quantification strategy, care must be taken
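Eisenhardt's constructs were, of course, derived through careful qualitative coding, but the basic move of the synthetic strategy (from event sequences to case-level measures that can be compared across cases) can be sketched in a few lines. The event types, measures, and cases below are invented for illustration and are not Eisenhardt's actual coding scheme:

```python
# Illustrative sketch of the synthetic strategy: compressing event-level
# process data into case-level "synthetic" variables. All event types,
# measures, and case data here are hypothetical.
from dataclasses import dataclass

@dataclass
class Event:
    day: int          # days since the decision process began
    kind: str         # e.g., "alternative_raised", "info_received"

def synthesize(events: list[Event]) -> dict:
    """Compress one case's event sequence into holistic process measures."""
    alternatives = [e for e in events if e.kind == "alternative_raised"]
    info = [e for e in events if e.kind == "info_received"]
    duration = max(e.day for e in events) - min(e.day for e in events)
    return {
        "n_alternatives": len(alternatives),
        # crude "simultaneity": alternatives raised within the first third
        "simultaneous": sum(1 for e in alternatives if e.day <= duration / 3),
        "n_info_events": len(info),
        "duration_days": duration,
    }

# Two hypothetical decision processes: a fast one and a slow one
case_a = [Event(0, "alternative_raised"), Event(2, "alternative_raised"),
          Event(3, "info_received"), Event(30, "decision_made")]
case_b = [Event(0, "alternative_raised"), Event(50, "info_received"),
          Event(90, "alternative_raised"), Event(180, "decision_made")]

for name, case in [("A", case_a), ("B", case_b)]:
    print(name, synthesize(case))
```

The point of the sketch is the compression itself: once each case is reduced to a row of such measures, the rows can be tabulated against an outcome (such as decision speed), which is exactly where the richness of the original event data risks being lost.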


not to ditch the detailed temporal understanding obtained for its shadow. This means drawing (as the researchers cited do) on the entire qualitative database to show how and why the variables identified lead to the consequences predicted.

In terms of data requirements, the synthetic strategy requires a clear definition of the boundaries of the processes studied and a level of abstraction high enough to permit the comparison of several cases (accuracy will therefore be moderate at best). It also requires sufficient cases to allow satisfactory comparison and conclusion drawing (e.g., Eisenhardt, 1989a, used 8 cases and Nutt, 1984, included 78 cases). This tends to correspond to a thinner level of detail in process tracing for each case than for other strategies. When the number of cases is moderate, this adds to the need to show strong grounding of the explanatory mechanisms within the data themselves and to connect these to other literature, in order to make the relationships identified credible and to enhance external validity (Eisenhardt, 1989b). Like the quantification strategy, this strategy has the advantage of producing relatively simple theoretical formulations that are also moderately general because they have been conceived to make sense of data from a number of cases.

Qualitative/Quantitative Data versus Process/Variance Analysis: Other Approaches

The description of seven sensemaking strategies for process data is now complete. In my analysis I assumed that the problem was to construct theory from qualitative "process data" collected in close contact with real contexts. Thus, I emphasized the large area of overlap between qualitative data and process theorizing. However, it is important to note that qualitative data do not necessarily demand process analysis and that process theory can be built from quantitative data.

The first point should be obvious. Qualitative data can be used for many purposes that have little to do with how events are sequenced over time. For example, they can be used to develop rich descriptions of meanings, behaviors, and feelings evoked by workplace issues at one point in time (e.g., Pratt & Rafaeli, 1997, on organizational dress). They can be used to understand individuals' mental maps of the elements

in their world (Huff, 1990) and so on. Some but not all of the seven sensemaking strategies I have described can be used in these nonprocess situations (e.g., grounded theory in the first example and visual mapping in the second), but my discussion does not pretend to deal with these rather different applications.

The second issue is whether process theory can be derived from purely quantitative data, such as archival time series or panel questionnaires. It can, of course, using similar statistical techniques to those mentioned under the quantification strategy, but this is not a perspective that I have explored or favored here. Quantitative time series constitute rather coarse-grained outcroppings of events and variables over time: they skim the surface of processes rather than plunge into them directly. Nevertheless, such methods are rapidly penetrating the strategy field and contributing significantly to a more dynamic understanding of strategic evolution (e.g., Barnett & Burgelman, 1996). As such, they are complementary to the approaches discussed here. Indeed, as Van de Ven and colleagues' work has shown, there is much to be gained from collecting both quantitative time series and qualitative stories in the same process research effort (Brewer & Hunter, 1989).

It is also worth mentioning another quantitative approach to developing process theory that, at first sight, appears to be even more distant from real processes because its "data" are entirely artificial. This is computer simulation, of which the most influential examples are Cyert and March's (1963) behavioral theory of the firm and Cohen, March, and Olsen's (1972) "garbage can" model of organizational choice (but see also Sastry's, 1997, formalization of the punctuated equilibrium model of change and Lant & Mezias's, 1992, work on organizational learning). As Weick (1979) himself noted, these models are high in simplicity and generality but generally weak in terms of accuracy. Real data may have been collected at some time and may have inspired the ideas behind the model. But, in most cases, the model is not linked to specific empirical observations.²

² Note, however, that empirical calibration can be attempted and may add to the credibility of such models. See, for example, Hall's (1976) study of the decline and failure of the Saturday Evening Post.

Yet, such models have several advantages. First, provided their basic assumptions are intuitively reasonable, the models can be used as sites for risk-free experimentation. Second, because they are not constrained by real measurements, they can deal with constructs that would be unobservable in reality (e.g., managerial energy in the garbage can model). Third, they may allow the detection and correction of inconsistencies in existing theoretical frameworks (e.g., Sastry, 1997). But, above all, these models are powerful when they show how a few simple and plausible mechanisms can generate complex behavior patterns that we all recognize. Paradoxically, just like the narrative strategy, which is, on the contrary, very deeply rooted in real-life processes, the strength of a simulation comes from its capacity to create a feeling of "déjà vu," making sense of previously impenetrable experience. (It is surely no accident that the garbage can model has been popular among academics!)
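The flavor of such models is easiest to see in miniature. The sketch below is loosely in the spirit of the garbage can model but is deliberately much simpler than, and not faithful to, Cohen, March, and Olsen's published specification; all rules and parameter values are invented for illustration. Problems arrive at random choice opportunities, participants deposit energy into choices, and a "decision" occurs either by resolution (accumulated energy meets the load of attached problems) or by oversight (a choice is made before any problems attach to it):

```python
# A toy "garbage can"-style simulation, loosely inspired by Cohen, March,
# and Olsen (1972). This is an illustrative sketch, not their model: the
# arrival, matching, and energy rules here are simplified inventions.
import random

random.seed(7)  # fixed seed so runs are repeatable

def simulate(n_steps=200, n_choices=5, arrival_p=0.3, energy_per_step=1.0):
    problems = {c: [] for c in range(n_choices)}   # problem loads per choice
    energy = {c: 0.0 for c in range(n_choices)}    # energy accumulated per choice
    resolutions = oversights = 0
    for _ in range(n_steps):
        # a problem flows into a randomly chosen choice opportunity
        if random.random() < arrival_p:
            problems[random.randrange(n_choices)].append(random.uniform(1, 5))
        # a participant deposits energy into one randomly chosen choice
        c = random.randrange(n_choices)
        energy[c] += energy_per_step
        if not problems[c]:
            oversights += 1       # decision by "oversight": no problems attached
            energy[c] = 0.0
        elif energy[c] >= sum(problems[c]):
            resolutions += 1      # enough energy to resolve the attached problems
            problems[c] = []
            energy[c] = 0.0
    return resolutions, oversights

res, over = simulate()
print(f"resolutions={res}, oversights={over}")
```

Even these few invented rules produce a recognizable pattern: in runs of this toy, decisions made by oversight typically far outnumber those made by resolving problems, loosely echoing the best-known result of the original model.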

DISCUSSION AND CONCLUSION

There are constant calls in the scholarly literature for more in-depth process research that will enable us to understand organizational phenomena at more than a superficial level. And yet, when we actually go out and get the data required to achieve this, we find that the deep understanding we sought does not magically leap out at us. Process data are notoriously challenging. In this article I have examined seven generic strategies for making sense of them (see Table 1). In the following discussion I review the strategies from a number of different angles. First, I compare their positioning according to Weick's (1979) criteria. Second, I situate them within a more general framework and examine the ways in which they can be combined. Third, I consider the roles of induction, deduction, and inspiration in the theory development process.

Accuracy, Simplicity, and Generality

All seven strategies have unique strengths. But all have weaknesses. As Thorngate (1976) and Weick (1979) indicate, any research strategy demands tradeoffs among accuracy, generality, and simplicity. In particular, accuracy tends to conflict with both simplicity and generality, while, at least in my analysis, simplicity and generality tend to be more compatible (see Table 1). The approximate positioning of each strategy with respect to these dimensions is illustrated in Table 2. For the sake of contrast, I have also included the computer simulation approach in this table.

This portrait of the different strategies does not provide an answer to the question "Which strategy is best?" However, it maps the terrain and shows that "good" process research can take a variety of routes. Some strategies favor accuracy, remaining more deeply rooted in the raw data (narrative strategy and grounded theory). Others are more reductionist, although they allow the development and testing of parsimonious theoretical generalizations (quantification, synthetic strategy, and simulation).

Overall (see Figure 2), the different strategies tend to run the length of an "efficient frontier" that represents the range of tradeoffs between accuracy and simplicity. From a pragmatic standpoint, the two extremes (simulation and narrative) are riskier because of the sacrifices they require on key dimensions. In addition, the alternate templates approach is a special case not positioned within the table. While each individual template provides simplicity but limited accuracy, between them multiple templates can increase overall accuracy while maintaining simplicity and generality, as long as the temptation to integrate divergent perspectives is avoided. The idea that multiple templates can produce better understandings may also be generalized to the use of multiple strategies, again provided the combinations are complementary and provided simplicity is not compromised in the attempt to achieve integration.

TABLE 2
Sensemaking Strategies and Accuracy, Simplicity, and Generality^a

Strategy                Accuracy    Simplicity    Generality
Narrative               High        Low           Low
Grounded theory            |           |             |
Temporal bracketing        |           |             |
Visual mapping             |           |             |
Synthetic strategy         |           |             |
Quantification             v           v             v
Computer simulation     Low         High          High

^a The orderings in this table are approximate; there are variations among specific applications. In particular, while accuracy and simplicity are almost always in opposition to one another, the generality of emerging theories will depend on other factors, such as the degree and scope of replication and the source of the conceptual ideas.


Variations, Permutations, and Combinations

One way to explore the potential for combinations of sensemaking strategies and to organize them within a common framework is to consider them as falling into three sequentially linked groups that I term grounding strategies, organizing strategies, and replicating strategies.

The grounded theory and alternate templates strategies can be considered as grounding strategies because they suggest two different sources for concepts that can be used within the context of other strategies. Grounded theory involves data-driven categories, whereas the alternate templates strategy involves theory-driven constructs. The two strategies, thus, represent the purist forms of inductive and deductive reasoning, respectively. Both forms of grounding can contribute to the construction of narratives and visual maps, and both strategies can be used as tools in the comparative analysis of cases (the synthetic strategy) or the comparative analysis of phases (temporal bracketing). Alternate templates also can be used to test quantitative process models.

The narrative and visual mapping strategies can be viewed as organizing strategies because, as described earlier, they are ways of descriptively representing process data in a systematic, organized form. As such, they often, although not always, constitute the initial rather than final steps in the sensemaking process. Both narratives and visual maps can serve as intermediary databases for the identification of phases (temporal bracketing), events (quantification), and constructs (synthetic strategy) and for the formulation of hypotheses and propositions. Since narratives are closer to the raw data than visual maps, they may also precede their development.

Finally, the remaining three strategies (temporal bracketing, quantification, and synthesis) can be considered replicating strategies since they represent different ways of decomposing the data for the replication of theoretical propositions (by phase, by event, and by case). These strategies can draw on almost any or all of the others. Quantified event data may also be aggregated for use in synthetic case comparisons (Eisenhardt, 1989b) or for comparative analysis of phases (e.g., see Barley, 1986). Conversely, phase-by-phase information (Garud & Van de Ven, 1992) or case-by-case information (Cheng & Van de Ven, 1996) may be incorporated into quantitative models.

This categorization imposes some order on what may so far have seemed a rather eclectic typology of sensemaking approaches, but not, I hope, too much order. The last thing I wish to advocate is a homogeneous recipe for theorizing from process data that leaves no room for loose ends or creativity. The choice of strategies is more than just a case of desired levels of accuracy, simplicity, and generality and more than just a case of picking logically linked combinations; it is also a question of taste, of research objectives, of the kind of data available, and of imagination.

Moreover, variety contributes to richness. The seven sensemaking strategies produce seven different senses. Method and theory are closely intertwined. As I have noted, some strategies tend to focus on the meaning of processes for individuals, that is, the way they are experienced (grounded theory and narrative strategy). Others are better equipped for tracing overall temporal patterns (visual mapping, quantification, and grounded theory). Some more easily reveal driving process motors or mechanisms (alternate templates, temporal bracketing, and quantification), and some are more useful for prediction (synthetic strategy). There are also undoubtedly other strategies with which I am less familiar (e.g., literary or critical approaches) that could make different kinds of sense again.

Induction, Deduction, and Inspiration

Beyond the individual strategies and their biases, my reading of the literature and my own experience reinforce the belief that there is a step in the connecting of data and theory that escapes any deliberate sensemaking strategy a researcher might decide to apply. As Mintzberg (1989) insists, analysis does not produce synthesis. Theory development is a synthetic process. Whatever strategy is used, there will always be an uncodifiable step that relies on the insight and imagination of the researcher (Weick, 1989). Wolcott (1994) distinguishes clearly between the two processes of analysis and interpretation. Interpretation corresponds to this creative element. Clearly, this does not absolve the researcher from the need to test his or her interpretations systematically. Analysis, thus, is important to stimulate and verify theoretical ideas. But, unfortunately for those who seek the magic bullet, it cannot produce them alone.


This also means that persistent calls for codification of qualitative methods (Larson & Løwendahl, 1995; Orton, 1997) can reach a point of diminishing returns, because we just do not know and cannot tell where that critical insight came from. Nobody asks quantitative researchers to explain how they thought up their conceptual frameworks (although Sutton, 1997, suggests that many may have been inspired by "closet" qualitative research!).

Another way to think about this is that theory building involves three processes: (1) induction (data-driven generalization), (2) deduction (theory-driven hypothesis testing), and (3) inspiration (driven by creativity and insight). "Inspiration" may be stimulated by empirical research, by reading, by thought experiments, and by mental exercises (Weick, 1979, 1989), but its roots are often untraceable. It draws indiscriminately on formal data, experience, a priori theory, and common sense. It works when it succeeds in creating new and plausible connections between all of these that can be made explicit as theoretical products, exposed to the scrutiny of others, and verified.

In closing, this brings me to the question of the nature of the linkage between data and theory in process research. In theorizing from process data, we should not have to be shy about mobilizing both inductive (data-driven) approaches and deductive (theory-driven) approaches iteratively or simultaneously as inspiration guides us. There is room not only for building on existing constructs to develop new relationships (Eisenhardt, 1989b) but for designing process research that selectively takes concepts from different theoretical traditions and adapts them to the data at hand, or takes ideas from the data and attaches them to theoretical perspectives, enriching those theories as it goes along. There is also room for developing new strategies for understanding processes that mix and match those I have presented here or that take a new tack entirely. Sensemaking is the objective. Let us make sense whatever way we can.

REFERENCES

Abbott, A. 1990. A primer on sequence methods. Organization Science, 1: 375-392.
Allison, G. T. 1971. Essence of decision. Boston: Little, Brown.
Barley, S. R. 1986. Technology as an occasion for structuring: Evidence from observations of CT scanners and the social order of radiology departments. Administrative Science Quarterly, 31: 78-108.
Barnett, W. P., & Burgelman, R. A. 1996. Evolutionary perspectives on strategy. Strategic Management Journal, 17(Special Issue): 5-20.
Bartunek, J. 1984. Changing interpretive schemes and organizational restructuring: The example of a religious order. Administrative Science Quarterly, 29: 355-372.
Bower, J. 1997. Strategy process research. Symposium presented at the annual meeting of the Academy of Management, Boston.
Brewer, J., & Hunter, A. 1989. Multimethod research. Newbury Park, CA: Sage.
Browning, L. D., Beyer, J. M., & Shetler, J. C. 1995. Building cooperation in a competitive industry: SEMATECH and the semiconductor industry. Academy of Management Journal, 38: 113-151.
Bryson, J. M., & Bromiley, P. 1993. Critical factors affecting the planning and implementation of major projects. Strategic Management Journal, 14: 319-338.
Burgelman, R. A. 1983. A process model of corporate venturing. Administrative Science Quarterly, 28: 223-244.
Chandler, A. D. 1964. Strategy and structure. Cambridge, MA: MIT Press.
Cheng, Y.-T., & Van de Ven, A. H. 1996. Learning the innovation journey: Order out of chaos. Organization Science, 7: 593-614.
Cohen, M. D., March, J. G., & Olsen, J. P. 1972. A garbage can model of organizational choice. Administrative Science Quarterly, 17: 1-25.
Cohen, M. D., & Sproull, L. S. 1991. Editors' introduction: Special issue on organizational learning: Papers in honor of James G. March. Organization Science, 2(1).
Collis, D. J. 1991. A resource-based analysis of global competition: The case of the bearings industry. Strategic Management Journal, 12(Special Issue): 49-68.
Cyert, R. M., & March, J. G. 1963. A behavioral theory of the firm. Englewood Cliffs, NJ: Prentice-Hall.
Daft, R. L. 1983. Learning the craft of organizational research. Academy of Management Review, 8: 539-546.
Dawson, P. 1994. Organizational change: A processual approach. London: Chapman.
Dean, J. W., & Sharfman, M. P. 1996. Does decision process matter? A study of strategic decision-making effectiveness. Academy of Management Journal, 39: 368-396.
Denis, J.-L., Langley, A., & Cazale, L. 1996. Leadership and strategic change under ambiguity. Organization Studies, 17: 673-699.
Doz, Y. L. 1996. The evolution of cooperation in strategic alliances: Initial conditions or learning processes? Strategic Management Journal, 17(Special Issue): 55-83.
Dutton, J. E., & Dukerich, J. M. 1991. Keeping an eye on the mirror: Image and identity in organizational adaptation. Academy of Management Journal, 34: 517-554.
Dyer, W. G., & Wilkins, A. 1991. Better stories, not better constructs, to generate better theory: A rejoinder to Eisenhardt. Academy of Management Review, 16: 613-619.
Eisenhardt, K. M. 1989a. Making fast strategic decisions in high-velocity environments. Academy of Management Journal, 31: 543-576.
Eisenhardt, K. M. 1989b. Building theories from case study research. Academy of Management Review, 14: 532-550.
Eisenhardt, K. M., & Bourgeois, L. J. 1988. Politics of strategic decision making in high-velocity environments. Academy of Management Journal, 31: 737-770.
Garud, R., & Van de Ven, A. H. 1992. An empirical evaluation of the internal corporate venturing process. Strategic Management Journal, 13(Special Issue): 93-109.
Giddens, A. 1984. The constitution of society. Berkeley, CA: University of California Press.
Gioia, D. A., & Chittipeddi, K. 1991. Sensemaking and sensegiving in strategic change initiation. Strategic Management Journal, 12: 433-448.
Gioia, D. A., Thomas, J. B., Clark, S. M., & Chittipeddi, K. 1994. Symbolism and strategic change in academia: The dynamics of sensemaking and influence. Organization Science, 5: 363-383.
Glaser, B. G. 1978. Theoretical sensitivity. Mill Valley, CA: Sociology Press.
Glaser, B. G., & Strauss, A. L. 1967. The discovery of grounded theory. Chicago: Aldine.
Golden-Biddle, K., & Locke, K. 1993. Appealing work: An investigation of how ethnographic texts convince. Organization Science, 4: 595-616.
Guba, E., & Lincoln, Y. S. 1994. Handbook of qualitative research. Newbury Park, CA: Sage.
Hall, R. 1976. A system pathology of an organization: The rise and fall of the old Saturday Evening Post. Administrative Science Quarterly, 21: 185-211.
Hinings, C. R., & Greenwood, R. 1988. The dynamics of strategic change. Oxford, England: Blackwell.
Huff, A. 1990. Mapping strategic thought. Chichester, England: Wiley.
Ilinitch, A. Y., D'Aveni, R. A., & Lewin, A. Y. 1996. New organizational forms and strategies for managing in hypercompetitive markets. Organization Science, 7: 211-221.
Isabella, L. A. 1990. Evolving interpretations as change unfolds: How managers construe key organizational events. Academy of Management Journal, 33: 7-41.
Johnson, G. 1987. Strategic change and the management process. Oxford, England: Blackwell.
Langley, A. 1997. L'étude des processus stratégiques: Défis conceptuels et analytiques. Management International, 2(1): 37-50.
Langley, A., Mintzberg, H., Pitcher, P., Posada, E., & Saint-Macary, J. 1995. Opening up decision making: The view from the black stool. Organization Science, 6: 260-279.
Langley, A., & Truax, J. 1994. A process study of new technology adoption in smaller manufacturing firms. Journal of Management Studies, 31: 619-652.
Lant, T. K., & Mezias, S. J. 1992. An organizational learning model of convergence and reorientation. Organization Science, 3: 47-71.
Larson, R., & Løwendahl, B. 1997. The qualitative side of management research: A meta-analysis of espoused and used case-study methodologies. Paper presented at the annual meeting of the Academy of Management, Boston.
Lee, A. S. 1989. A scientific methodology for MIS case studies. MIS Quarterly, 13: 33-50.
Leonard-Barton, D. 1990. A dual methodology for case studies: Synergistic use of a longitudinal single site with replicated multiple sites. Organization Science, 1: 248-266.
Lincoln, Y. S., & Guba, E. 1985. Naturalistic enquiry. Beverly Hills, CA: Sage.
Locke, K. 1996. Rewriting "The Discovery of Grounded Theory" after 25 years. Journal of Management Inquiry, 5: 239-245.
Lyles, M. A., & Reger, R. K. 1993. Managing for autonomy in joint ventures: A longitudinal study of upward influence. Journal of Management Studies, 30: 383-404.
Markus, L. 1983. Power, politics and MIS implementation. Communications of the ACM, 26: 430-444.
Meyer, A. D. 1984. Mingling decision making metaphors. Academy of Management Review, 9: 6-17.
Meyer, A. D. 1991. Visual data in organizational research. Organization Science, 2: 218-236.
Meyer, A. D., & Goes, J. B. 1988. Organizational assimilation of innovations: A multilevel contextual analysis. Academy of Management Journal, 31: 897-923.
Miles, M. B., & Huberman, A. M. 1994. Qualitative data analysis. Newbury Park, CA: Sage.
Mintzberg, H. 1979. An emerging strategy of "direct" research. Administrative Science Quarterly, 24: 580-589.
Mintzberg, H. 1989. Mintzberg on management. New York: Free Press.
Mintzberg, H., Raisinghani, D., & Théoret, A. 1976. The structure of unstructured decision processes. Administrative Science Quarterly, 21: 246-275.
Mohr, L. B. 1982. Explaining organizational behavior. San Francisco: Jossey-Bass.
Monge, P. R. 1990. Theoretical and analytical issues in studying organizational processes. Organization Science, 1: 406-430.
Newman, M., & Robey, D. 1992. A social process model of user-analyst relationships. MIS Quarterly, 16: 249-259.
Nutt, P. C. 1984. Types of organizational decision processes. Administrative Science Quarterly, 29: 414-450.