JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR, 1982, 37, 149-155, NUMBER 1 (JANUARY)

DISTINGUISHING BETWEEN DISCRIMINATIVE AND MOTIVATIONAL FUNCTIONS OF STIMULI

JACK MICHAEL

WESTERN MICHIGAN UNIVERSITY

A discriminative stimulus is a stimulus condition which, (1) given the momentary effectiveness of some particular type of reinforcement (2) increases the frequency of a particular type of response (3) because that stimulus condition has been correlated with an increase in the frequency with which that type of response has been followed by that type of reinforcement. Operations such as deprivation have two different effects on behavior. One is to increase the effectiveness of some object or event as reinforcement, and the other is to evoke the behavior that has in the past been followed by that object or event. "Establishing operation" is suggested as a general term for operations having these two effects. A number of situations involve what is generally assumed to be a discriminative stimulus relation, but with the third defining characteristic of the discriminative stimulus absent. Here the stimulus change functions more like an establishing operation than a discriminative stimulus, and the new term, "establishing stimulus," is suggested. There are three other possible approaches to this terminological problem, but none are entirely satisfactory.

Key words: stimulus control, establishing operation, establishing stimulus, deprivation, reinforcement

BACKGROUND CONCEPTS

Technical Terminology

Skinner (1957, p. 430) described a three-stage process in the development of scientific verbal behavior: first, more effective forms of verbal behavior are discovered; these are then explicitly adopted and encouraged by the relevant technical and scientific community; finally, these verbal practices are themselves critically examined and altered to overcome what seem to be inadequacies or limitations. This paper fits into the third stage of this process. Our way of talking about operant stimulus control seems to include but fails to distinguish between two quite different forms of control. We might improve our verbal practices by adopting a new technical term for one of these forms of control or at least by explicitly recognizing the problem. But before dealing with this issue it is necessary to refine the concept of the discriminative stimulus or SD and to review the way deprivation affects behavior.

The analysis in this paper developed as a result of discussions with a number of colleagues and students but especially G. Dennehy, C. Cherpas, B. Fulton, B. Hesse, M. Minervini, L. Parrott, M. Peterson, M. Sundberg, P. Whitley, and K. Wright, who are also responsible for many improvements in the present manuscript. Reprints may be obtained from Jack Michael, Department of Psychology, Western Michigan University, Kalamazoo, Michigan 49008.

The Discriminative Stimulus

It seems in keeping with current usage to describe the presentation of a discriminative stimulus, or the change from SΔ to SD, in terms of three defining features. It is a stimulus change which, (1) given the momentary effectiveness of some particular type of reinforcement1 (2) increases the frequency of a particular type of response (3) because that stimulus change has been correlated with an increase in the frequency with which that type of response has been followed by that type of reinforcement. (Frequency of reinforcement is the most common variable used to develop the discriminative stimulus relation. However, this relation can also be developed on the basis of reinforcement quantity or quality, the delay to reinforcement, the response requirement or effort, and other variables.)

1It would be more in keeping with current practice to make frequent use of the term "reinforcer," rather than "type of reinforcement" or "form of reinforcement," etc. However, I have argued in another paper (Michael, 1975) that "reinforcer" implies a static event, which cannot have the behavioral implications that "reinforcer" appears to have. "Reinforcement" properly implies stimulus change, and only change can function as a consequence for behavior. Therefore throughout this paper "reinforcer" is deliberately avoided.

The first feature is sometimes taken for granted but for the present purposes it is better to be explicit: SDs do not generally alter response frequency when the organism is satiated with respect to the type of reinforcement relevant to that SD. The second feature is important in distinguishing behavioral from cognitive accounts of stimulus control, where the stimulus supposedly "signals" the availability of reinforcement, without any direct implication for any particular type of behavior. Whereas the first two features describe the controlling relation once it has been developed, the third identifies the relevant history and thus makes possible a distinction between the operant discriminative stimulus and the unconditioned and conditioned stimuli of the respondent relation. There are a number of situations involving what is generally taken to be an SD because the relation seems so obviously operant rather than respondent, but where the third defining feature is clearly absent. An attempt will be made to show that in some of these situations the stimulus change is functioning more like a motivational operation such as deprivation or aversive stimulation.

The Behavioral Effects of Deprivation

How does deprivation, of water for example, affect behavior? It is necessary to distinguish two quite different effects which cannot be easily derived from one another. One is an increase in the effectiveness of water as reinforcement for any new behavior which should happen to be followed by access to water. The other is an increase in the frequency of all behavior that has been reinforced with water and in this respect is like the evocative effect of an SD. Operant behavior can thus be increased in frequency (evoked) in two different ways. Consider, for example, an organism that is at least somewhat water deprived and for which some class of responses has a history of water reinforcement. Assume further that the current stimulus conditions have been associated with a low, but nonzero, frequency of water reinforcement for those responses. Such responses can be made momentarily more frequent (1) by further depriving the organism of water, or (2) by changing to a situation where they have been more frequently followed by water reinforcement (the SD effect). The distinction between these two ways to evoke operant behavior is the basis for the suggestion of a new term which is the main point of the present paper.

The Need for a More General Term: the "Establishing Operation"

The term "deprivation" has been generally used for relations such as the one discussed above but does not adequately characterize many of them. Salt ingestion, perspiration, and blood loss have similar effects but cannot be accurately referred to as water deprivation. Aversive stimulation also establishes its absence as reinforcement and evokes the behavior that has in the past removed it. Likewise temperature changes away from the organism's normal thermal condition increase the effectiveness of changes in the opposite direction as reinforcement and also evoke behavior that has resulted in such changes.

Skinner explicitly identifies deprivation-satiation operations and aversive stimulation as motivational variables (1957, pp. 31-33 and also 212), and with the term "predisposition" (1953, p. 162) includes the so-called emotional operations in this collection. Again, the two different effects of such operations seem clear. For example, to be angry is, in part, to have one's behavior susceptible to reinforcement by signs of discomfort on the part of the person one is "angry at," and also to be engaging in behavior that has produced such effects. Likewise, "fear," from an operant point of view at least, seems to consist of an increased capacity for one's responses to be reinforced by the removal of certain stimuli plus the high frequency of behavior that has accomplished such removal.

A general term is needed for operations having these two effects on behavior. There is, of course, the traditional "motive" and "drive," but these terms have a number of disadvantages, not the least of which is the strong implication of a determining inner state. I have found "establishing operation" appropriate in its commitment to the environment, and by abbreviating it to EO one may achieve the convenience of a small word without losing the implications of the longer term. An establishing operation, then, is any change in the environment which alters the effectiveness of some object or event as reinforcement and simultaneously alters the momentary frequency of the behavior that has been followed by that reinforcement. (Students of J. R. Kantor may see some similarity to his term "setting factor," but I believe that his term includes operations that have a broader or less specific effect on behavior than the establishing operation as defined above.) There is still a problem with this usage in that "establishing" implies only "increasing," but changes obviously occur in both directions. "Deprivation" has the same limitation and is usually accompanied by its opposite "satiation." It does not seem useful at this time to introduce the term "abolishing" to serve a similar function, so perhaps in the present context "establishing" should be taken to be short for "establishing or abolishing."

The value of a general term is more than just terminological convenience, however. The absence of such a term may have been responsible for some tendency to disregard such effects or to subsume them under other headings. For example, it is common to describe the basic operant procedure as a three-term relation involving stimulus, response, and consequence. Yet it is clear that such a relation is not in effect unless the relevant establishing operation is at an appropriate level. A stimulus that is correlated with increased frequency of water reinforcement for some class of responses will not evoke those responses if water is not currently effective as reinforcement.

Furthermore, aversive stimuli and other events like temperature changes that quickly evoke behavior may appear to be discriminative stimuli but should not be so considered, although the argument is somewhat complex. As mentioned earlier, in order for a stimulus to be considered a discriminative stimulus the differential frequency of responding in its presence as compared with its absence must be due to a history of differential reinforcement in its presence as compared with its absence. Although it is not usually mentioned, there is in this requirement the further implication that the event or object which is functioning as reinforcement must have been equally effective as reinforcement in the absence as in the presence of the stimulus. It would not, for example, be considered appropriate discrimination training if during the presence of the SD, the organism was food deprived and received food as reinforcement for responding, but in the absence of the SD it was food satiated, and responding was not followed by food. Such training would be considered of dubious value in developing stimulus control because during the absence of the SD, the satiated organism's failure to receive food after the relevant response (if that response were to occur) could not easily be considered "unreinforced" responding, since at that time food would not be functioning as reinforcement.

The type of differential reinforcement relevant to stimulus control ordinarily implies that in the presence of the stimulus, the organism receives the reinforcement with greater frequency than it receives the reinforcement in the absence of the stimulus. If in the absence of the stimulus, the critical event no longer functions as reinforcement, then receiving it at a lower frequency is not behaviorally equivalent to a "lower frequency of reinforcement." This supplement to the definition of the SD relation is not usually mentioned simply because the SD-SΔ concepts were developed in a laboratory setting with food and water as reinforcement and with the SD and SΔ alternating during the same session. If food was reinforcing during the presence of the SD, it would generally be equally reinforcing during its absence. With establishing operations that affect behavior more quickly, however, this aspect of the definition becomes more critical. Aversive stimulation is just such an establishing operation. Consider a typical shock-escape procedure. The organism is in a situation where the shock can be administered until some response, say a lever press, occurs. This escape response removes the shock for a period, then the shock comes on again, and so on. With a well-trained organism the shock onset evokes an immediate lever pressing response, and since the relation is obviously an operant one it might seem reasonable to refer to the shock as an SD for the lever press. For the shock to be an SD, it must have been a stimulus in the presence of which the animal received more frequent reinforcement, in this case shock termination, than it received in its absence. But in the absence of the shock, failure to receive shock termination for the lever press is not properly considered a lower frequency of reinforcement. Unless the shock is on, shock termination is not behaviorally functional as a form of reinforcement, and the fact that the lever does not produce this effect is irrelevant. The shock, in this situation, is functioning more like an establishing operation, such as food deprivation, than like an SD. It evokes the escape response because it changes what functions as reinforcement rather than because it is correlated with a higher frequency of reinforcement. It would be quite possible, of course, to contrive a proper discriminative stimulus in the escape situation. Let the escape response terminate shock only when a tone is sounding; the shock remains on irrespective of the animal's behavior when the tone is not sounding. Lever pressing is clearly unreinforced when the tone is off, and the tone is thus clearly an SD for lever pressing.
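The contrast between the shock and the tone can be made concrete with a toy tally, offered here only as an illustrative sketch and not as part of the original procedure. It assumes a session of discrete trials, a fixed probability that the animal presses the lever, and a tone sounding on half the trials, and it simply counts how often a press is followed by shock termination with the tone on versus off; the resulting differential is the kind of history that makes the tone, and not the shock, the SD.

    import random

    def tone_escape_session(trials=1000, p_press=0.5, seed=0):
        # Toy tally for the tone-escape arrangement described above.
        # Trial structure, probabilities, and names are illustrative
        # assumptions introduced here, not features of the article.
        rng = random.Random(seed)
        followed = {"tone on": 0, "tone off": 0}
        presses = {"tone on": 0, "tone off": 0}
        for _ in range(trials):
            tone = "tone on" if rng.random() < 0.5 else "tone off"
            if rng.random() < p_press:       # the animal presses the lever
                presses[tone] += 1
                if tone == "tone on":        # termination only while the tone sounds
                    followed[tone] += 1
        return followed, presses

    followed, presses = tone_escape_session()
    for tone in ("tone on", "tone off"):
        print(tone, ":", followed[tone], "of", presses[tone],
              "presses followed by shock termination")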

In summary, we could improve our verbal behavior about behavior if we could identify all environmental operations which alter the effectiveness of events as reinforcement with the same term, and especially if this term has also been explicitly linked to the evocative effects of such operations. "Establishing operation" might very well accomplish these purposes.

THE ESTABLISHING STIMULUS OR SE

Establishing Conditioned Reinforcement

Most of the establishing operations discussed so far have been the kind that alter the effectiveness of stimulus changes that can be classified as unconditioned reinforcement. Stimulus changes identified as conditioned reinforcement are also established as such by various operations. The most obvious are the same operations that establish the effectiveness of the relevant unconditioned reinforcement. A light correlated with food becomes effective conditioned reinforcement as a function of food deprivation. Information about the location of a restaurant becomes reinforcing when food becomes reinforcing. There is, however, a common situation in which a stimulus change establishes another stimulus change as conditioned reinforcement without altering the effectiveness of the relevant unconditioned reinforcement. If the behavior which has previously obtained such conditioned reinforcement now becomes strong we have an evocative relation like that produced by an establishing operation but where the effect depends upon an organism's individual history rather than the history of the species. I would like to suggest the term "establishing stimulus" and SE for this relation.

General Conditions for the Establishing Stimulus

The circumstances for an establishing stimulus (Figure 1) involve a stimulus change, S1, which functions as a discriminative stimulus for a response, R1, but under circumstances where that response cannot be executed or cannot be reinforced until another stimulus change, S2, takes place. This second stimulus change, then, becomes effective as conditioned reinforcement, and the behavior that has in the past achieved this second stimulus change, R2, is evoked. S1, then, is an SD for R1, but an SE for R2.
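The dual function of S1 can be summarized, purely as an illustrative reading aid and not as anything from the article itself, in a few lines of code. The names S1, S2, R1, and R2 follow Figure 1; the decision rule is an assumed bookkeeping device.

    def evoked_by_S1(s1_present: bool, s2_present: bool) -> dict:
        # Which responses are evoked, and on what basis, given the
        # current stimulus conditions (an illustrative sketch only).
        effects = {}
        if s1_present:
            # Discriminative effect: R1 has been differentially
            # reinforced in the presence of S1.
            effects["R1"] = "evoked (S1 as SD)"
            if not s2_present:
                # Establishing effect: S2 now functions as conditioned
                # reinforcement, so R2, which has previously produced
                # S2, is evoked.
                effects["R2"] = "evoked (S1 as SE; S2 established as conditioned reinforcement)"
        return effects

    print(evoked_by_S1(s1_present=True, s2_present=False))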

A Human Example

Suppose that an electrician is prying a face plate off a piece of equipment which must be removed from the wall. The electrician's assistant is nearby with the tool box. The removal of the face plate reveals that the equipment is fastened to the wall with a slotted screw. We can consider the slotted screw under the present circumstances to be a discriminative stimulus (S1) for the use of a screw driver in removing the screw (R1). (The reinforcement of this behavior is, of course, related to the electrician's job. When the wall fixture is removed and a new one applied, payment for the job may become available, etc.) But removing the screw is not possible without an appropriate screw driver (S2).

[Figure 1 diagram not reproduced; it shows S1 evoking R1 (S1 as SD), R1 unable to occur successfully without S2, and S1 establishing S2 as conditioned reinforcement and evoking R2 (S1 as SE).]

Fig. 1. General condition for an establishing stimulus. S1, functioning as an SD, evokes R1, but this response cannot occur or cannot be reinforced without the presence of S2. Thus S1 also functions as an SE, establishing S2 as a form of conditioned reinforcement and at the same time evoking R2, which has previously produced S2.


So the electrician turns to the assistant and says "screw driver" (R2). It is the occurrence of this response which illustrates a new type of evocative effect.

It is reasonable to consider the offered screw driver as the reinforcement of the request, and the history of such reinforcement to be the basis for the occurrence of this request under the present circumstances. It might also seem reasonable to consider the sight of the slotted screw as an SD for this response, but here we should be cautious. If it is proper to restrict the definition of the SD to a stimulus in the presence of which the relevant behavior has been more frequently reinforced, then the present example does not qualify. A slotted screw is not a stimulus that is correlated with an increased frequency of obtaining screw drivers. Electricians' assistants generally provide requested tools irrespective of the use to which they will be put. The presence and attention of the assistant are SDs correlated with successful asking, but not the slotted screw. However, it is an SD for unscrewing responses, a stimulus in the presence of which unscrewing responses (with the proper tool) are correlated with more frequent screw removal, but not for asking for screw drivers. The evocative effect of the slotted screw on asking behavior is more like the evocative effect of an establishing operation than that of an SD, except for its dependence on the organism's individual history. The slotted screw is better considered an establishing stimulus for asking, not a discriminative stimulus.

An Animal Analogue

It is not difficult to describe an SE situation in the context of an animal experiment, but first it is necessary to describe two secondary features of the human situation giving rise to this concept. First, it must be possible to produce the second stimulus change (S2) at any time and not just after the first stimulus change (S1). Otherwise we are dealing with simple chaining. Furthermore, the second stimulus change should not be one which once achieved remains effective indefinitely with no further effect or cost to the organism, or it will become a standard form of preparatory behavior. In the electrician's situation, if there were some tool that was used on a high proportion of activities it would be kept always available. On the other hand, to ask for a tool when it is not needed would result in the work area being cluttered with unneeded tools (assume a small or crowded work area, etc.). This second feature would seem usually to involve some form of punishment for R2 that prevents it until S1 has made it necessary.

Now, for the animal analogue consider a food-deprived monkey in a chamber with a chain hanging from the ceiling and a retractable lever. Pulling the chain moves the lever into the chamber. Pressing the lever has no effect unless a light on the wall is on, in which case a lever press dispenses a food pellet. To prevent the chain pull from functioning as a standard preparatory component of the behavioral sequence, we could require that the chain be held in a pulled condition, or we could arrange that each chain pull makes the lever available for only a limited period, say five seconds. In either case we would expect a well-trained monkey ultimately to display the following repertoire: while the wall light is off (before the electrician has seen the slotted screw), the chain pull does not occur (a screw driver is not requested), even though it would produce the lever (even though the assistant would provide one). When the light comes on (when the slotted screw is observed), the monkey pulls the chain (the electrician requests the screw driver) and then presses the lever (and then unscrews the screw) and eats the food pellet that is delivered (and removes the piece of equipment, finishes the job, etc.).

Returning to Figure 1, S1 is the onset of the wall light, which evokes lever pressing (R1). Lever pressing, however, cannot occur without the lever (S2), and thus the availability of the lever becomes an effective form of reinforcement once the wall light is on. The chain pull is R2, evoked by the light functioning as an SE rather than as an SD because the light is not correlated with more frequent lever availability (the reinforcement of the chain pull) but rather with greater effectiveness of the lever as a form of reinforcement.
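The contingencies of this analogue can also be written out as a small state-machine sketch. The code below is only a toy model: the limited lever availability follows the text (five time steps standing in for five seconds), but the time step, the agent's decision rule, and all names are assumptions introduced here rather than features of the experiment.

    class MonkeyChamber:
        # Toy model of the chamber contingencies described above.
        LEVER_STEPS = 5   # a chain pull makes the lever available only briefly

        def __init__(self):
            self.light_on = False
            self.lever_steps_left = 0
            self.pellets = 0

        def pull_chain(self):
            # Pulling the chain moves the lever into the chamber for a limited period.
            self.lever_steps_left = self.LEVER_STEPS

        def press_lever(self):
            # A lever press dispenses food only while the wall light is on.
            if self.lever_steps_left > 0 and self.light_on:
                self.pellets += 1

        def tick(self):
            # The lever retracts again after its limited period.
            if self.lever_steps_left > 0:
                self.lever_steps_left -= 1

    def well_trained_repertoire(chamber):
        # The chain pull occurs only once the light is on: the light acts
        # as an SE for the chain pull and as an SD for the lever press.
        if chamber.light_on and chamber.lever_steps_left == 0:
            chamber.pull_chain()
        if chamber.light_on and chamber.lever_steps_left > 0:
            chamber.press_lever()

    chamber = MonkeyChamber()
    for t in range(20):
        chamber.light_on = t >= 10   # the wall light comes on halfway through
        well_trained_repertoire(chamber)
        chamber.tick()
    print("pellets earned:", chamber.pellets)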

ALTERNATIVE SOLUTIONS

Larger Units of Behavior

It may seem reasonable to consider the response evoked by the SE to be simply an element in a chain of responses evoked by an SD. Thus, with the electrician, asking for the screw driver might be interpreted as a part of a larger unit of behavior evoked by the slotted screw and reinforced by successful removal of the screw. From this point of view, the SE concept might seem unnecessary. That is, the slotted screw could be considered a discriminative stimulus for the larger unit, which is more likely to achieve the removal of the screw in the presence of a slotted screw than in the presence of a wing nut, a hex nut, etc. This analysis is not too plausible, however, since it would imply that requests are not sensitive to their own immediate consequences but only to the more remote events related to the use of the requested item. But even if this were true, we would still have to account for the formation or acquisition of the large unit of behavior. This has usually involved reference to repeated occurrence of chains of smaller units (for example, Skinner, 1938, 52ff and 102ff; Keller & Schoenfeld, 1950, Chapter 7; Millenson, 1967, Chapter 12), and the initial element of this particular chain would require the SE concept. That is, we might be able to do without the SE in the analysis of the current function of a large unit but would need it to account for the first element of the chain of smaller units out of which the large unit was formed.

Conditional Conditioned Reinforcement

The notion that a form of conditioned reinforcement may be conditional upon the presence of another stimulus condition is quite reasonable and requires no new terminology. This could be referred to as conditional conditioned reinforcement, and the SE of the previous sections can be seen to be the type of stimulus upon which such conditioned reinforcement is conditional. This general approach, however, fails to implicate the evocative effect which is the main topic of the present paper and thus seems less satisfactory than the new terminology.

Retaining the SD by Complicating the Reinforcement

If we consider the chain pull to be reinforced, not by the lever insertion into the chamber but by the more complex stimulus change from light on with lever out to light on with lever in, we may be able to retain the notion of the light-onset as an SD for the chain pull, because the more complex stimulus change can only be produced when the light is on. This involves adding a static component to our reinforcing stimulus change, which may be theoretically sound, but is certainly not the usual way we talk about reinforcement. In the human example this would mean that the screw driver does not reinforce the request, but rather the change from looking at a slotted screw without a screw driver in hand to looking at one with a screw driver in hand. As with conditional conditioned reinforcement this is not an issue regarding the facts of behavior, but rather our verbal behavior concerning these facts. Will we be more effective in intellectual and practical ways by introducing a new stimulus function and retaining a simple form of verbal behavior about reinforcement, or will we be better off retaining the SD interpretation for both types of evocation but complicating our interpretation of reinforcement for one of them? I clearly favor the former.

SUMMARY

In everyday language we can and often do distinguish between changing people's behavior by changing what they want and changing their behavior by changing their chances of getting something that they already want. Our technical terminology also makes such a distinction, but only in the case of establishing operations such as deprivation and those kinds of reinforcing events called "unconditioned." Much more common are those stimulus changes which alter the reinforcing effectiveness of events ordinarily referred to as conditioned reinforcement, and which evoke the behavior that has previously produced this reinforcement. We do not have a convenient way of referring to such stimulus changes, and because of this they may be subsumed under the heading of discriminative stimuli. I have suggested the term "establishing stimulus" for such events, thus linking them with establishing operations such as deprivation, and, I hope, suggesting the relation to the individual's history by the replacement of "operation" with "stimulus."

REFERENCES

Keller, F. S., & Schoenfeld, W. N. Principles of psychology. New York: Appleton-Century-Crofts, 1950.

Michael, J. L. Positive and negative reinforcement, a distinction that is no longer necessary; or a better way to talk about bad things. Behaviorism, 1975, 3, 33-44.

Millenson, J. R. Principles of behavioral analysis. New York: Macmillan, 1967.

Skinner, B. F. The behavior of organisms. New York: Appleton-Century-Crofts, 1938.

Skinner, B. F. Science and human behavior. New York: Macmillan, 1953.

Skinner, B. F. Verbal behavior. New York: Appleton-Century-Crofts, 1957.

Received September 22, 1980
Final acceptance August 31, 1981