Transcript
Page 1:

Operant Conditioning: Schedules of Reinforcement

Page 2:

Schedules of Reinforcement

Continuous reinforcement refers to reinforcement being administered to each instance of a response.

Partial reinforcement lies between continuous reinforcement and extinction.

Page 3:

An Example of Continuous Reinforcement

Each instance of a smile is reinforced

Page 4:

Schedules of Reinforcement

Continuous Reinforcement – A schedule of reinforcement in which every correct response is reinforced.

Partial Reinforcement – One of several reinforcement schedules in which not every correct response is reinforced.

Which method do you think is used more in real life?

Page 5:

Schedules of Reinforcement

Ratio Version – having to do with instances of the behavior.

Ex. – Reinforce or reward the behavior after a set number of times (x) that the action or behavior is demonstrated.

Interval Version – having to do with the passage of time.

Ex. – Reinforce the participant after a set period of time (x) during which the behavior is displayed.

Page 6:

When attempting to determine whether an example is Interval or Ratio:

Always ask: Does it deal with time?

If Yes = Interval; if No = Ratio.
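This decision rule, combined with the fixed-versus-variable distinction developed on the following slides, can be written out as a tiny classifier. The Python sketch below is an illustration added for clarity, not part of the original slides; the two flag names are assumptions about how an example might be described.

```python
def classify_schedule(deals_with_time: bool, amount_is_fixed: bool) -> str:
    """Apply the two questions from the slides.

    deals_with_time -- Does the example deal with time? Yes = interval, No = ratio.
    amount_is_fixed -- Is the required time / number of responses always the same?
    """
    kind = "Interval" if deals_with_time else "Ratio"
    timing = "Fixed" if amount_is_fixed else "Variable"
    return f"{timing} {kind}"


# Allowance every other Friday: time-based, same wait each time -> Fixed Interval
print(classify_schedule(deals_with_time=True, amount_is_fixed=True))
# Slot machine: response-based, unpredictable number of plays -> Variable Ratio
print(classify_schedule(deals_with_time=False, amount_is_fixed=False))
```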

Page 7:

Fixed-Interval Schedule

Fixed-interval schedule – A schedule in which a fixed amount of time must elapse between the previous and subsequent times that reinforcement will occur.

Ex. Johnny gets five M & M’s every 5 minutes he sits in his seat quietly. (Think Pre-school)

Page 8:

A fixed-interval schedule is when reinforcement is received after a fixed amount of time has passed. Ex. You get an allowance every other Friday.

Page 9:

A variable-interval schedule is when reinforcement occurs after varying amounts of time. Ex. Fishing and catching a fish after varying amounts of time.

Page 10:

Variable-Ratio Schedule

Variable-ratio schedule – A schedule in which reinforcement is provided after a variable number of correct responses.

Produces an overall high, consistent rate of responding.

Ex. – On average, the rat presses the bar 5 times for one pellet of food.

Page 11:

A variable-ratio schedule is when an unpredictable number of responses is required before reinforcement can be obtained. Ex. Slot machines.

Page 12:

A fixed-ratio schedule is when a specific number of correct responses is required before reinforcement can be obtained. Ex. Buy 10 haircuts, get 1 free.
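Taken together, the four definitions are simple rules about when a reward is delivered. The Python sketch below is an illustration added for clarity, not part of the original slides; it simulates twenty time steps with one response per step and marks which steps each schedule would reinforce. The specific numbers (a 5-step interval, a 10-response ratio, and the random ranges) are assumptions chosen to echo the slide examples.

```python
import random

def fixed_interval(step: int, period: int = 5) -> bool:
    """FI: reinforce on a fixed time basis, e.g. every 5 time steps."""
    return step % period == 0

def fixed_ratio(responses: int, ratio: int = 10) -> bool:
    """FR: reinforce every `ratio`-th response, e.g. buy 10 haircuts, get 1 free."""
    return responses % ratio == 0

class VariableRatio:
    """VR: reinforce after an unpredictable number of responses (slot machine)."""
    def __init__(self, low: int = 2, high: int = 8):
        self.low, self.high = low, high
        self.needed = random.randint(low, high)
        self.count = 0

    def respond(self) -> bool:
        self.count += 1
        if self.count >= self.needed:
            self.count = 0
            self.needed = random.randint(self.low, self.high)
            return True
        return False

class VariableInterval:
    """VI: reinforce the first response after an unpredictable wait (fishing)."""
    def __init__(self, low: int = 2, high: int = 8):
        self.low, self.high = low, high
        self.wait = random.randint(low, high)
        self.elapsed = 0

    def tick(self) -> bool:
        self.elapsed += 1
        if self.elapsed >= self.wait:
            self.elapsed = 0
            self.wait = random.randint(self.low, self.high)
            return True
        return False

vr, vi = VariableRatio(), VariableInterval()
for step in range(1, 21):  # assume exactly one response per time step
    marks = [
        "FI" if fixed_interval(step) else "--",
        "FR" if fixed_ratio(step) else "--",
        "VR" if vr.respond() else "--",
        "VI" if vi.tick() else "--",
    ]
    print(f"step {step:2d}: {' '.join(marks)}")
```

Running the sketch a few times shows how unpredictable the variable schedules are, which is the intuition behind the slower extinction reported for them in the comparison table on Page 14.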

Page 13:

Page 14:

Comparisons of Schedules of Reinforcement

SCHEDULE          | FORM OF REWARD                              | INFLUENCE ON PERFORMANCE                          | EFFECTS ON BEHAVIOR
Fixed interval    | Reward on fixed time basis                  | Leads to average and irregular performance        | Fast extinction of behavior
Fixed ratio       | Reward tied to specific number of responses | Leads quickly to very high and stable performance | Moderately fast extinction of behavior
Variable interval | Reward given after varying periods of time  | Leads to moderately high and stable performance   | Slow extinction of behavior
Variable ratio    | Reward given for some behaviors             | Leads to very high performance                    | Very slow extinction of behavior

Page 15:

FI, VI, FR, or VR?

1. When I bake cookies, I can only put one set in at a time, so after 10 minutes my first set of cookies is done. After another ten minutes, my second set of cookies is done. I get to eat a cookie after each set is done baking.

2. After every 10 math problems that I complete, I allow myself a 5 minute break.

3. I look over my notes every night in math because I never know when my teacher will give a pop quiz.

4. When hunting season comes around, sometimes I’ll spend all day sitting in the woods waiting to get a shot at a big buck. It’s worth it though when I get a nice 10 pt buck.

5. Today in Psychology class we were talking about Schedules of Reinforcement and everyone was eagerly raising their hands and participating. Miranda raised her hand a couple of times and was eventually called on.

1. FI

2. FR

3. VI

4. VI

5. VR

Page 16:

FI, VI, FR, or VR?

6. Madison spanks her son if she has to ask him three times to clean up his room.

7. Emily has a spelling test every Friday. She usually does well and gets a star sticker.

8. Steve's a big gambling man. He plays the slot machines all day hoping for a big win.

9. Snakes get hungry at certain times of the day. They might watch any number of prey go by before they decide to strike.

10. Mr. Bertani receives a salary paycheck every 2 weeks. (Miss Suter doesn't.)

11. Christina works at a tanning salon. For every 2 bottles of lotion she sells, she gets 1 dollar in commission.

12. Mike is trying to study for his upcoming Psychology quiz. He reads five pages, then takes a break. He resumes reading and takes another break after he has completed 5 more pages.

6. FR

7. FI

8. VR

9. VI

10. FI

11. FR

12. FR

Page 17:

FI, VI, FR, or VR?

13. Megan is fundraising to raise money so she can go on the annual band trip. She goes door to door in her neighborhood trying to sell popcorn tins. She eventually sells some.

14. Kylie is a business girl who works in the big city. Her boss is busy, so he only checks her work periodically.

15. Mark is a lawyer who owns his own practice. His customers make payments at irregular times.

16. Jessica is a dental assistant and gets a raise every year at the same time and never in between.

17. Andrew works at a GM factory and is in charge of attaching 3 parts. After he gets his parts attached, he gets some free time before the next car moves down the line.

18. Brittany is a telemarketer trying to sell life insurance. After so many calls, someone will eventually buy.

13. VR

14. VI

15. VI

16. FI

17. FR

18. VR

