Learning: Principles and Applications, 4/e
Stephen B Klein, Mississippi State University

Biological Influences On Learning

Chapter Outline

  1. GENERALITY OF THE LAWS OF LEARNING

    Many psychologists believe that there are general laws that govern learning. These general laws are revealed in classical and operant conditioning. Both Pavlov and Skinner supported the claim that general laws of learning can be discovered through laboratory experiments.

  2. A BEHAVIOR SYSTEMS APPROACH

    The behavior systems approach, developed by Timberlake, is based on the assumption that learning evolved as a modifier of innate behavioral systems. Learning therefore allows behavior to be flexible enough to meet the demands of the environment. Because species occupy different environmental niches, what can be learned varies from species to species. These variations result from either a predisposition or a constraint.

    A predisposition describes a situation in which a certain type of learning occurs quite readily in a given species. A constraint refers to less rapid learning by a particular species as compared to other species.

  3. ANIMAL MISBEHAVIOR

    Breland and Breland (1961, 1966) trained a variety of species to perform a wide range of responses. However, the Brelands observed that some operant responses, although initially performed efficiently, deteriorated with continued practice and reinforcement. The Brelands assumed that practice allowed the opportunity for the elicitation of instinctive reactions, which were then reinforced by food. The Brelands called the decrease in operant responding with continued reinforcement instinctive drift, and they called the instinctive behavior that interfered with operant responding animal misbehavior.

    Based on their experimental work, Timberlake and colleagues proposed the appetitive structure view. This hypothesis proposes that animal misbehavior involves both Pavlovian and operant conditioning, and represents species-typical foraging and food-handling behaviors that are elicited by pairing food with the natural cues that control feeding.

  4. SCHEDULE-INDUCED BEHAVIOR

    Skinner reported that pigeons engage in ritualistic behavioral patterns (schedule-induced behavior) while performing a key peck response for food on a fixed-interval schedule. Once a particular pattern of behavior emerged, the pigeons repeatedly exhibited it with increasing strength as training continued. Skinner called this behavior on the fixed-interval schedule superstitious behavior.

    Since Skinner's initial observation, Staddon and Simmelhag argued that pigeons show superstitious behavior because food is highly predictable. The birds engage in terminal behaviors that are reinforced by food, but they also show a wide variety of interim behaviors that are not contiguous with reinforcement. The occurrence of a high level of interim behaviors on fixed-interval schedules is referred to as adjunctive behavior.

    1. Schedule-Induced Polydipsia: The most extensively studied form of adjunctive behavior is the excessive intake of water (polydipsia) when rats are reinforced with food on a fixed-interval schedule. Schedule-induced polydipsia has been observed on a wide range of fixed-interval schedules.

      Several factors contribute to the amount of schedule-induced polydipsia. First, the level of water intake increases as body weight decreases. Second, the amount of polydipsia increases when the available fluid is preferred. Finally, the relationship between the length of time between reinforcements and the level of polydipsia is an inverted U-shaped function.

    2. Other Schedule-Induced Behaviors: Several other instinctive behaviors have been studied using interval schedules. These behaviors include schedule-induced wheel running and schedule-induced aggression.

    3. The Nature of Schedule-Induced Behavior: Riley and Wetherington (1989) proposed that schedule-induced behavior is an instinctive reaction elicited by periodic deliveries of reinforcement.

    4. Does Schedule-Induced Behavior Occur in Humans?: Although some forms of schedule-induced behavior occur in humans, there are important differences between the human and nonhuman research. In contrast to the pronounced display of schedule-induced behaviors in animals, humans show weak and variable forms of such behaviors. The basis for this difference between humans and animals is unknown.

  5. FLAVOR AVERSION LEARNING

    Flavor-aversion learning is based upon an association of a flavor with illness formed over a long CS-UCS interval. Therefore, flavor-aversion learning represents an example of long-delay learning.

    1. The Selectivity of Flavor Aversion Learning: Seligman (1970) proposed that rats have an evolutionary preparedness to associate tastes with illness, as shown by the fact that flavor-aversion learning occurs after a single conditioning trial.

    2. Flavor Aversion Learning in Humans: Humans also learn flavor aversions.

    3. Nature of Flavor Aversion Learning: Two proposals, the learned safety hypothesis and the concurrent interference theory, have been developed to explain flavor-aversion learning. Research indicates that both proposals are relevant to the explanation of flavor-aversion learning.

      1. Learned-Safety Theory

        Kalat and Rozin's (1971) learned safety theory suggests that a unique learning process is responsible for flavor-aversion learning. Animals experience ingestional neophobia when given a novel food. If illness does not follow ingestion of the new food, the animal learns that the novel food is safe. Learned safety therefore counters the natural reluctance to consume new foods.

      2. Concurrent-Interference View

        Revusky (1971) proposed the concurrent interference view as another explanation for flavor-aversion learning. According to Revusky, flavor-aversion learning is a form of long-delay conditioning in which there is minimal opportunity for CSs other than taste to become associated with illness. Long-delay learning therefore prevents intervening CSs from interfering with the association of taste with illness.

  6. IMPRINTING

    1. Infant Love: Lorenz (1952) observed the social attachment process (imprinting) in animals and discovered that a newly hatched bird will approach, follow, and form a social attachment to the first moving object it encounters.

      Harlow (1971) described an attachment process in infant monkeys, who prefer a soft terry-cloth surrogate mother over a wire-mesh surrogate. Ainsworth (1977) also observed the attachment process in human infants, who prefer responsive and sensitive mothers.

      Critical or sensitive periods are important in imprinting. The sensitive period is a developmental stage when there is a greater likelihood of forming an attachment. However, imprinting can still occur after the sensitive period has passed if extended training is given.

    2. Other Examples of Imprinting: Sexual and food preferences represent two other examples of imprinting.

      1. Sexual Preference

        The eventual sexual preference of many birds is established during a sensitive period before the birds are sexually mature.

      2. Food Preference

        Sensitive periods exist for the establishment of food preferences in birds. Human food preferences may also result from experiences with various nutrients during a sensitive period from ages 6 to 1

    3. Nature of Imprinting: Imprinting has been explained by associative learning and inherited programming hypotheses.

      1. An Associative Learning View

        Moltz (1960, 1963) offered the associative learning view to explain imprinting in birds. Early in life, large objects, such as the mother, attract the chick's attention, and the chick orients toward these objects. When the chick is older, its fear system becomes established: unfamiliar objects now elicit fear, whereas familiar objects reduce fear because they are associated with low levels of arousal. The presence of familiar objects produces relief, and the chick is reinforced for moving closer to them.

        The results of Harlow's classic studies of monkeys raised with surrogate mothers confirm that similar emotional imprinting processes occur in primates.

        Ainsworth and associates (1977, 1979, 1982) reported a similar attraction to security in human infants. A secure relationship between infant and mother is observed when the mother is sensitive and responsive to her infant, whereas an anxious relationship is found when the mother acts indifferently toward her infant.

      2. An Instinctive View of Imprinting

        Lorenz (1935) characterized imprinting as a genetically programmed form of learning that has an important adaptive function. This is an instinctive view of imprinting.

        Imprinting clearly differs from other forms of associative learning. For example, imprinting occurs to certain objects more readily than to others.

  7. THE AVOIDANCE OF AVERSIVE EVENTS

    1. Species-Specific Defense Reactions: Bolles (1970, 1978) proposed that animals have species-specific defense reactions (SSDRs) that assist in the avoidance of or escape from dangerous events. These reactions have an innate basis and are elicited by signals of danger, allowing the animal to avoid aversive events. The instinctive reactions that allow animals to avoid aversive situations differ across species and are determined by each species' evolutionary history.

    2. Predispositions and Avoidance Learning: Bolles (1978) assumed that Pavlovian conditioning rather than operant conditioning is responsible for avoidance learning. According to Bolles, the association of environmental stimuli with aversive events is responsible for the development of avoidance behavior.

  8. THE BIOLOGY OF REINFORCEMENT AND PUNISHMENT

    1. Electrical Stimulation of the Brain: Olds and Milner (1954) discovered that electrical stimulation of certain areas of the brain serves as a positive reinforcer, while stimulation of other areas produces punishing effects. Olds and Milner also found that rats would learn a bar-press response when it was followed by brain stimulation. Thus, electrical stimulation of the brain (ESB), also known as intracranial self-stimulation (ICSS), has a powerful reinforcing effect.

    2. Anatomical Location of Reinforcement and Punishment: Stein and associates (1969) have presented evidence that a structure in the brain's limbic system, known as the medial forebrain bundle (MFB), controls the effects of positive reinforcement. Another limbic system area, the periventricular tract (PVT), represents the brain's punishment area.

    3. The Influence of the Medial Forebrain Bundle:

      1. The Reinforcing Effect of MFB Stimulation

        Stimulation of the MFB produces a very strong positive reinforcement effect. ESB has a more powerful effect on behavior than do conventional reinforcers such as food, water, and sex. Human research has documented that ESB produces pleasurable feelings.

      2. The Motivational Influence of MFB Stimulation

        ESB will motivate eating if food is available or drinking when water is present. This phenomenon is called stimulus-bound behavior to indicate that the environmental stimulus determines the action that will be motivated by brain stimulation.

      3. The Influence of Reinforcers on MFB Function

        Several studies indicate that the presence of reinforcement increases the effect of MFB activity. Moreover, a preferred reinforcer increases the value of brain stimulation more than does a less preferred reinforcer.

      4. The Influence of Deprivation on the MFB

        Drive increases the value of reinforcers. This effect is probably due to an increased response of the MFB when reinforcers reduce high drive states. Research has established that increases in drive states, such as hunger and thirst, lead to higher rates of operant responding for ESB.

    4. Mesotelencephalic Reinforcement System: The mesotelencephalic reinforcement system is the brain's reinforcement system, responsible for the effects of reinforcement; it contains two important pathways. One is the tegmentostriatal pathway, which includes the MFB. The second is the nigrostriatal pathway.

      1. Function of the Two Reinforcement Systems

        The two pathways appear to regulate two different aspects of reinforcement. The tegmentostriatal pathway appears to be involved in motivation, and the nigrostriatal pathway may be involved in helping to consolidate memory.

      2. Dopaminergic Control of Reinforcement

        Dopamine is the neurotransmitter that plays an important role in regulating the behavioral effects of reinforcement; it governs activity of the ventral tegmental area (VTA). One indication of dopaminergic influence is the strong reinforcing effect of cocaine and amphetamine, both of which increase levels of dopamine in the mesotelencephalic reinforcement system. Another line of research shows that reinforcers initiate the release of dopamine in the nucleus accumbens (NA), and natural reinforcers promote the same release of dopamine in the NA.

      3. Opiate Activation of the Tegmentostriatal Pathway

        Animals learn to self-administer opiate drugs (heroin, morphine), which suggests that these substances have reinforcing properties. The ability of opiates to serve as reinforcers appears to be due to opiate receptor sites located in the tegmentostriatal pathway. Thus, two neurochemical systems, one based upon dopamine and the other based upon natural opiates, regulate activity in the tegmentostriatal pathway, and both systems may activate the NA.

      4. Individual Differences in Mesotelencephalic Reinforcement System Functioning

        Recent evidence suggests that rats differ in how reactive they are to various environmental cues. These differences correlate with differences in mesotelencephalic activity.

    5. The Impact of the PVT Punishment System: ESB of the periventricular tract (PVT) appears to be aversive. Such stimulation elicits defensive reactions in animals. Furthermore, PVT stimulation inhibits reinforcer-seeking behavior. Animals learn behaviors that escape or avoid ESB of the PVT. Finally, the neurotransmitter, acetylcholine, appears to be an important regulator of PVT functioning.