LIST OF EXAMPLES OF NEGATIVE REINFORCEMENT AND OF PARTIAL SCHEDULES OF REINFORCEMENT TAKEN FROM VARIOUS TEXTBOOKS (These examples were compiled from textbooks by Miguel Roig and his colleague Carolyn Vigorito and posted on the web by Jeff Bartel of Shippensburg University, Pennsylvania; modified from http://www.ship.edu/~jsbart/psy101/learning_supplement.html.)
A lot of students are confused about negative reinforcement. What's the difference between that and punishment? Perhaps some examples of negative reinforcement would be helpful (remember, it's "reinforcement," so the behavior increases, and because it's "negative," an aversive stimulus is removed after the response; punishment, by contrast, decreases the behavior).
A loud buzz sounds in some cars when the ignition key is turned on; the driver must put on the safety belt in order to eliminate the irritating buzz (Gredler, 1992). The buzz is a negative reinforcer for putting on the seat belt.
Feigning a stomach ache in order to avoid school (Gredler, 1992). School is a negative reinforcer for feigning stomach aches.
Rushing home in the winter to get out of the cold (Weiten, 1992). Fanning oneself to escape from the heat (Zimbardo, 1992). Cold weather is a negative reinforcer for rushing home (the colder it is, the faster you walk), and heat is a negative reinforcer for fanning.
Cleaning the house to get rid of a disgusting mess (Weiten, 1992), or cleaning the house to get rid of your mother's nagging (Bootzin et al., 1991; Leahy & Harris, 1989). The nagging/mess is a negative reinforcer for cleaning.
Studying for an exam to avoid getting a poor grade (Bootzin & Acocella, 1980). A low grade is a negative reinforcer for studying (though a high grade is, at the same time, a positive reinforcer for studying).
Taking aspirin to relieve a headache (Bootzin & Acocella, 1980; Buskist & Gerbing, 1990; Gerow, 1992). A good example: the headache is a negative reinforcer for taking medication.
Removing a stone that has lodged inside the shoe while walking (Pettijohn, 1992; Roediger, Capaldi, Paris, & Polivy, 1991). The pain is a negative reinforcer for stopping to take off your shoe.
Prisoners try to break out of jail to escape the aversiveness of being locked up (Domjan & Burkhard, 1993).
Leaving a movie theater if the movie is bad (Domjan & Burkhard, 1993).
Running from the building when the fire alarm sounds (Domjan & Burkhard, 1993). The fire alarm is a negative reinforcer for leaving the building.
Smoking in order to reduce a negative emotional state (Baron, 1992). The negative emotional state is a negative reinforcer for smoking.
Turning down the volume of a very loud radio (Roediger, Capaldi, Paris, & Polivy, 1991).
Changes in sexual behavior (e.g., wearing condoms) to avoid AIDS (Gerow, 1992).
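The common thread in all the examples above can be sketched as a toy simulation. This is only an illustrative sketch, not a model from any of the cited textbooks, and the numbers (starting probability 0.2, learning step 0.1) are hypothetical:

```python
# Toy illustration of negative reinforcement: a response that removes an
# aversive stimulus (e.g., buckling the seat belt to stop the buzz)
# becomes more likely over repeated trials.

def negative_reinforcement(prob, removed_aversive, step=0.1):
    """Strengthen the response whenever it removed the aversive stimulus."""
    if removed_aversive:
        prob = min(1.0, prob + step)  # reinforcement -> behavior increases
    return prob

p = 0.2  # hypothetical starting probability of buckling the seat belt
for trial in range(5):
    # the buzz stops each time the driver buckles up
    p = negative_reinforcement(p, removed_aversive=True)
print(round(p, 1))  # prints 0.7
```

The key point the sketch makes: removing the aversive stimulus *increases* the behavior, which is what distinguishes negative reinforcement from punishment (where behavior decreases).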
Another source of confusion is the reinforcement schedule. Again, here is a list compiled by Roig and Vigorito.
FIXED RATIO SCHEDULE
Frequent flyer programs: getting a free flight after accumulating x number of flight miles.
A factory worker paid for piecework (Bernstein, Roy, Srull, & Wickens, 1991; Bootzin, Bower, Crocker, & Hall, 1991). Being paid on commission (Gredler, 1992) or getting a bonus for every x number of items sold (Weiten, 1992).
A mailman must visit the same number of mailboxes each day in order to go home (Domjan & Burkhard, 1993).
Going up a staircase: you must climb the same number of stairs to get to the landing (Domjan & Burkhard, 1993).
A teenager who is paid by the job (e.g., by the amount of work completed) will probably mow more lawns than one who is paid by the hour.
Doing 20 situps to keep fit (Roediger, Capaldi, Paris, & Polivy, 1991).
VARIABLE RATIO SCHEDULE
Slot machines at a gambling casino (Baron, 1992; Bernstein, Roy, Srull, & Wickens, 1991; Carlson, 1990; Crooks & Stein, 1991; Gerow, 1992).
Using drugs to escape withdrawal symptoms (Gredler, 1992).
Fly fishing: casting and reeling back several times before catching a fish (Bootzin, Bower, Crocker, & Hall, 1991; Weiten, 1992).
Signaling while hitchhiking (Bootzin, Bower, Crocker, & Hall, 1991).
Buying lottery tickets (Pettijohn, 1992).
Sports games: e.g., variable number of strokes to finish a hole of golf (Baron, 1992); variable number of swings to hit the baseball; variable number of throws to get the basketball in the hoop; variable number of throws to get a strike in bowling (Domjan & Burkhard, 1993).
Each time a custodian cleans a room, a certain amount of cleaning is necessary; however, the amount varies from day to day and even from room to room (Domjan & Burkhard, 1993).
Playing Bingo (Gray, 1991).
FIXED INTERVAL SCHEDULE
Getting a paycheck at the end of the week (Baron, 1992; Bernstein, Roy, Srull, & Wickens, 1991; Leahy & Harris, 1989; McConnell, 1989).
Looking at your watch during a lecture, more and more often as the end of the lecture approaches (Catania, 1992).
Bill-passing behavior on the part of Congress: this behavior has been shown to increase as the recess period approaches (Weisberg & Waldrop, as cited in Houston, 1976).
Checking the oven to see if the cookies are done, when the cooking time is known (Gray, 1991).
Going to the cafeteria to see if the next meal is already available (Domjan & Burkhard, 1993).
Picking up the paper in the morning after it has been delivered at the same time every day (Peterson, 1991).
VARIABLE INTERVAL SCHEDULE
Surprise quizzes (Carlson, 1990; Gerow, 1992; Gleitman, 1981; Pettijohn, 1992; Rathus, 1990).
Speed traps on the highway (Gleitman, 1981).
Calling a friend and getting no answer or a busy signal because he is always on the phone; some variable amount of time will elapse until the call is reinforced by an answer (Bootzin, Bower, Crocker, & Hall, 1991; Crooks & Stein, 1991; Catania, 1992; Gray, 1991; Peterson, 1991; Pettijohn, 1992).
Fishing: a fish may be caught at intervals of approximately every two minutes, every hour, or every two days! (Carlson, 1990; Crooks & Stein, 1991; Houston, 1976)
Mail-checking behavior, assuming that the mail carrier comes at irregular intervals (Myers, 1992) (or checking email!).
Waiting for a taxi cab.
Random drug testing: workers refrain from taking drugs (Baron, 1992).
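The four schedules in the examples above differ only in the rule for when a reinforcer is delivered: ratio schedules count responses, interval schedules count elapsed time, and each rule can be fixed or variable. A minimal sketch of those rules follows; the specific numbers (20 responses, 7 days) are hypothetical illustrations, not values from the cited textbooks:

```python
import random

# Toy rules for when each partial schedule delivers reinforcement.
# "Ratio" schedules count responses; "interval" schedules count time
# elapsed since the last reinforcer.

def fixed_ratio(responses, n=20):
    # e.g., doing 20 situps: every nth response is reinforced
    return responses > 0 and responses % n == 0

def variable_ratio(rng, mean_n=20):
    # e.g., a slot machine: each response pays off with probability 1/mean_n,
    # so on average every mean_n-th response is reinforced
    return rng.random() < 1 / mean_n

def fixed_interval(days_since_last, interval=7):
    # e.g., a weekly paycheck: the first response after the interval
    # has elapsed is reinforced
    return days_since_last >= interval

def variable_interval(days_since_last, rng, mean_interval=7):
    # e.g., surprise quizzes: the required wait varies around a mean
    return days_since_last >= rng.randint(1, 2 * mean_interval - 1)

assert fixed_ratio(40)          # 40 situps: two reinforcers' worth
assert not fixed_ratio(19)      # one situp short of the ratio
assert fixed_interval(7)        # payday has arrived
assert not fixed_interval(6)    # one day early: no check yet
```

The sketch also shows why responding looks different under each schedule: with a variable rule the learner cannot predict which response (or moment) will pay off, which is why slot machines and surprise quizzes sustain such steady behavior.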