compound schedule | A complex contingency where two or more schedules of reinforcement are combined.
contingency | The specified relationship between a specific behavior and reinforcement.
contingency management | The use of contingent reinforcement and nonreinforcement to increase the frequency of appropriate behavior and eliminate inappropriate behaviors.
depression effect | An effect in which a shift from high to low reward magnitude produces a lower level of responding than if the reward magnitude had always been low.
differential reinforcement of high responding (DRH) schedule | A schedule of reinforcement in which a specific high number of responses must occur within a specified time in order for reinforcement to occur.
differential reinforcement of low responding (DRL) schedule | A schedule of reinforcement in which a certain amount of time must elapse without responding, with reinforcement following the first response after the interval.
differential reinforcement of other behaviors (DRO) | A schedule of reinforcement in which the absence of a specific response within a specified time leads to reinforcement.
differential reinforcement schedule | A schedule of reinforcement in which a specific number of behaviors must occur within a specified time in order for reinforcement to occur.
elation effect | An effect in which a shift from low to high reward magnitude produces a greater level of responding than if the reward magnitude had always been high.
extinction | The elimination or suppression of a response caused by the discontinuation of reinforcement or the removal of the unconditioned stimulus.
fixed-interval schedule (FI) | A contingency in which reinforcement is available only after a specified period of time, and the first response emitted after the interval has elapsed is reinforced.
fixed-ratio schedule (FR) | A contingency in which a specific number of responses is needed to produce reinforcement.
instrumental conditioning procedure | A conditioning procedure in which the environment constrains the opportunity for reward and a specific behavior is required to obtain the reward.
interval schedule of reinforcement | A contingency that specifies that reinforcement becomes available at a certain period of time after the last reinforcement.
negative contrast effect | An effect in which a shift from high to low reward magnitude produces a lower level of responding than if the reward magnitude had always been low.
negative reinforcer | The termination of an aversive event, which reinforces the behavior that terminated the aversive event.
operant chamber | An apparatus that provides an enclosed environment for the study of operant behavior.
operant conditioning | A form of conditioning in which a specific response produces reinforcement, and the frequency of the response determines the amount of reinforcement obtained.
partial reinforcement effect (PRE) | The greater resistance to extinction of an instrumental or operant response following intermittent rather than continuous reinforcement during acquisition.
positive contrast effect | An effect in which a shift from low to high reward magnitude produces a greater level of responding than if the reward magnitude had always been high.
positive reinforcer | An event whose occurrence increases the frequency of the behavior that precedes it.
postreinforcement pause | A cessation of behavior following reinforcement on a ratio schedule, which is followed by resumption of responding at the intensity characteristic of that ratio schedule.
primary reinforcer | An activity whose reinforcing properties are innate.
ratio schedule of reinforcement | A contingency that specifies that a certain number of behaviors are necessary to produce reinforcement.
reinforcer | An event (or termination of an event) that increases the frequency of the behavior that preceded it.
scallop effect | A pattern of behavior characteristic of a fixed-interval schedule, where responding stops after reinforcement and then slowly increases as the time approaches when reinforcement will be available.
schedule of reinforcement | A contingency that specifies how often or when we must act to receive reinforcement.
secondary reinforcer | An activity that has developed its reinforcing properties through its association with primary reinforcers.
shaping | A technique for acquiring a desired behavior by first selecting an operant behavior that occurs at a high rate, then slowly changing the contingency until the desired behavior is learned.
spontaneous recovery | The return of a CR when an interval intervenes between extinction and testing without additional CS-UCS pairings, or when the instrumental or operant response returns without additional reinforced experience.
successive approximation procedure | A technique for acquiring a desired behavior by first selecting a behavior with a high operant rate, then slowly changing the contingency until the desired behavior is learned.
variable-interval schedule (VI) | A contingency in which there is an average interval of time between available reinforcements, but the interval varies from one reinforcement to the next.
variable-ratio schedule (VR) | A contingency in which an average number of behaviors produces reinforcement, but the actual number of responses required to produce reinforcement varies over the course of training.
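The four basic schedules defined above (FR, VR, FI, VI) can be sketched as a small simulation. This is a minimal illustration, not taken from the text: the class names, the `respond()` interface, and the use of uniform random draws for the "variable" schedules are all assumptions made here for clarity.

```python
import random

# Hypothetical sketch of the four basic schedules of reinforcement.
# Each respond() call returns True when the contingency delivers
# reinforcement for that response.

class FixedRatio:
    """FR-n: reinforcement follows every n-th response."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0  # ratio requirement resets after reinforcement
            return True
        return False

class VariableRatio:
    """VR-n: an average of n responses is required, but the exact
    number varies from one reinforcement to the next."""
    def __init__(self, n, rng=None):
        self.n = n
        self.rng = rng or random.Random(0)
        self._draw_requirement()

    def _draw_requirement(self):
        # uniform draw over 1..2n-1 gives a mean requirement of n
        self.required = self.rng.randint(1, 2 * self.n - 1)
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self._draw_requirement()
            return True
        return False

class FixedInterval:
    """FI-t: the first response emitted after t time units have
    elapsed since the last reinforcement is reinforced."""
    def __init__(self, t):
        self.t = t
        self.last = 0.0

    def respond(self, now):
        if now - self.last >= self.t:
            self.last = now
            return True
        return False

class VariableInterval:
    """VI-t: an average of t time units must elapse, but the exact
    interval varies from one reinforcement to the next."""
    def __init__(self, t, rng=None):
        self.t = t
        self.rng = rng or random.Random(0)
        self.last = 0.0
        self._draw_interval()

    def _draw_interval(self):
        # uniform draw over (0, 2t) gives a mean interval of t
        self.required = self.rng.uniform(0, 2 * self.t)

    def respond(self, now):
        if now - self.last >= self.required:
            self.last = now
            self._draw_interval()
            return True
        return False
```

Running the ratio classes side by side shows the defining difference: `FixedRatio(3)` pays off on exactly every third response, while `VariableRatio(3)` pays off after an unpredictable count that merely averages three, which is why VR schedules sustain steadier responding without the postreinforcement pause typical of FR.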