B.F. Skinner's Theory

The Basics
B.F. Skinner is perhaps the predominant figure in American psychology. He is an
experimental psychologist at Harvard who has developed behaviorism as a position in
learning (he remains hesitant to use the term theory).

Skinner emphasizes observable behavior in the study of humans, hence the term
behaviorism. He rejects any attempt at introspection or use of hypothetical internal
processes or structures to account for learning. Instead, Skinner uses the
consequences of a behavior to explain why the behavior continues or fades.

Many of Skinner's ideas are built upon Thorndike's law of effect. Stated briefly, Skinner
believes (or has been conditioned to say?) that behavior that is followed by
reinforcement (positive or negative) has an increased probability of recurrence.
Behavior followed by extinction or punishment has a decreased probability of
recurrence.

Since learning is implied by a change in behavior, a teacher must first determine what
behavioral change is desirable, then manipulate the consequences to alter the
probability of the behavior recurring. Through proper use of shaping, the teacher can
promote the development of new behaviors. In concept, this is quite simple. In practice,
it is a bit more difficult, but quite within grasp, as research and experience with
programmed instruction and behavior modification show.

Skinner's ideas about instruction have been very influential in education. After a period
of almost total domination, behaviorism is beginning to wane, yet its impact will continue
to be felt.

Podcast Review

Dr. Hannum reviews Skinner's theory in an accompanying podcast, recorded during a
graduate seminar on learning theories.

View of Learning
Here is a comprehensive set of objectives for Skinner's theory, along with discussion
of each objective:

1. Describe the philosophical basis of Skinner's theory.


Partially in reaction to the field of psychoanalysis and the work of people like Freud,
Skinner thought the best way to advance the field of psychology was through
application of the scientific method based on observable experiments, not speculation
or theoretical musings. Skinner held firm to the logical positivist position that all we can
really know is that which we can learn through direct observation using our senses. He
was not inclined to speculate about things nor to hypothesize about why something
might have happened. He conducted experiments, observed, and recorded the results.
Nothing else. He did this because of his belief that the only stable knowledge comes
from direct observation, not from speculation about internal matters or things that are
not directly observable.

2. Describe the four consequences that alter behavior, giving definitions and
examples of each.

Receive reinforcer (positive reinforcement)
Definition: A behavior is followed by the presentation of a positive stimulus, thus the
behavior increases.
Example: Giving students a gold star for completing work on time.

Remove unpleasant stimulus (negative reinforcement)
Definition: A behavior is followed by the removal of an unpleasant stimulus, thus the
behavior increases.
Example: Putting on sunglasses to remove the glare of the sun; allowing students to
quit working problems that don't interest them if they follow classroom rules about
arriving on time.

Receive unpleasant stimulus (punishment)
Definition: A behavior is followed by the presentation of an unpleasant stimulus, thus
the behavior decreases, at least temporarily.
Example: Spanking a child who misbehaves; assigning additional homework problems
to a student who is disruptive.

Withhold pleasant stimulus (extinction)
Definition: A behavior is followed by the withholding or removal of a positive stimulus,
thus the behavior decreases.
Example: Not allowing a student to go out on the playground when he has not
completed his work as scheduled.
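For quick reference, the four consequences above can be sketched as a small lookup
table in Python. The labels and phrasing are illustrative summaries of the table, not
Skinner's own notation:

```python
# Illustrative mapping (not Skinner's notation): each consequence pairs a
# change in stimulation with its effect on the future probability of the behavior.
CONSEQUENCES = {
    "positive reinforcement": ("present pleasant stimulus", "behavior increases"),
    "negative reinforcement": ("remove unpleasant stimulus", "behavior increases"),
    "punishment": ("present unpleasant stimulus", "behavior decreases"),
    "extinction": ("withhold pleasant stimulus", "behavior decreases"),
}
```

Note that both kinds of reinforcement increase behavior, while both punishment and
extinction decrease it; "positive" and "negative" refer to presenting versus removing a
stimulus, not to whether the consequence is good or bad.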

3. Describe the dependent and independent variables involved in learning.


For Skinner, the dependent variable involved in learning is a change in behavior
because that's the only thing that is directly observable. More specifically, it is the
frequency with which a specific behavior occurs that is the dependent variable.
Obviously Skinner would not award anything like "style points" to a behavior, because
of the qualitative dimension this would introduce. Rather, he would specifically define
and describe a behavior, then count the frequency with which it occurs following some
event. The main independent variable involved in learning for Skinner is whether the
behavior is reinforced. More specifically, Skinner would say that the independent
variable influencing learning is the consequence of the behavior. This could include
any of the four consequences described above.

4. Describe Skinner's objection to other theories.


Skinner didn't object so much to the other theories of learning because he thought they
were wrong but rather because he thought they added nothing to our understanding of
learning, since they were not based on tangible, observable, and repeatable factors.
Any theory that looked to an internal process to account for learning, such as
developmental and cognitive theories, was readily discounted by Skinner as being non-
scientific and thus of limited value. These theories did not generate substantial
knowledge that could be used to explain learning or to cause learning to happen,
according to Skinner. Thus, such theories were not worthy of consideration.

5. Define shaping and describe how and why it works.


In Skinner's view of learning, a person, or animal for that matter, must first emit or
demonstrate a specific behavior that is subsequently reinforced and thus becomes
learned. Reinforcement can follow only after the new behavior occurs. This raises the
question, "How can you get a new behavior to occur so you can reinforce it and cause
it to be learned?" The answer, according to Skinner, involves shaping. If a person is not
capable of the desired behavior then, of course, you can't reinforce the desired behavior
in that person. Instead, through the process of shaping, you begin by reinforcing any
approximation of the desired behavior. If a child can't pronounce the word you desire
them to pronounce, you begin by reinforcing any approximation of that word that they
can make. Then slowly you provide reinforcement only for those approximations that get
closer and closer to the actual pronunciation of the word. Gradually the pronunciation of
the word is shaped until it becomes the correct pronunciation. Then the student is only
reinforced for the correct pronunciation. Shaping works, according to behaviorism,
because the desired behavior is reinforced and thus more likely to occur. By reinforcing
only closer approximations to the desired behavior, the student gradually learns how to
emit the desired behavior through this process of shaping.
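The successive-approximation loop described above can be sketched in Python. This
is a minimal illustration, not anything Skinner specified: the target, the attempt scores,
and the shrinking tolerance are all hypothetical numbers standing in for "how close the
attempt is to the desired behavior."

```python
def reinforce(target, attempt, tolerance):
    """Deliver reinforcement if the attempt falls within the current
    tolerance of the target behavior (all measured on a 0-1 scale)."""
    return abs(target - attempt) <= tolerance

# Hypothetical shaping session: as the criterion (tolerance) tightens,
# only attempts closer and closer to the target earn reinforcement.
target = 1.0
attempts   = [0.2, 0.5, 0.7, 0.9, 1.0]   # gradually improving attempts
tolerances = [0.9, 0.6, 0.4, 0.2, 0.05]  # gradually stricter criterion

for attempt, tolerance in zip(attempts, tolerances):
    print(attempt, reinforce(target, attempt, tolerance))

# An early, rough attempt (0.5) earns reinforcement under the loose
# criterion but not under the final strict one:
reinforce(1.0, 0.5, 0.6)   # True
reinforce(1.0, 0.5, 0.05)  # False
```

The key design point is that the criterion moves, not the learner's goal: each attempt is
judged against the current tolerance, so reinforcement is always attainable yet always
pulling behavior toward the target.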

6. Describe operant behavior, and contrast it with respondent behavior.


Operant behavior, which is fundamental in Skinner's version of behaviorism, is
behavior that simply occurs and operates on the environment, and for which you can't
identify a specific cause that forces the behavior. Most human behavior would be
considered operant behavior. Respondent behavior is behavior that happens as a
result of reflexes rather than as a result of being rewarded. For example, blinking our
eyes when we get something in them is a respondent behavior. It is based on a reflex,
and there is a clear cause of that behavior.

7. Describe the act of learning according to Skinner.


Recalling that Skinner is a behaviorist who limits himself to describing observable
phenomena rather than speculating on any internal activities of the human mind,
learning is considered a change in behavior, nothing more, nothing less. So when
talking about the act of learning, Skinner is really talking about the act of behavioral
change. Behavior changes depending upon the consequences of that behavior. If a
specific behavior is reinforced, then the probability of that behavior occurring again is
increased. This, then, is how the act of learning occurs, because learning is nothing
more than a change in behavior in Skinner's view.

8. Discuss the role of stimuli (discriminating and reinforcing).

Stimuli play a key role in behaviorism. Stimuli that follow a behavior and increase the
probability of that behavior occurring again are called reinforcing stimuli. These
reinforcing stimuli, or reinforcers, play a central role in learning, without which learning
would not occur. Another type of stimulus is also important in a behavioral view. These
are the stimuli that set the occasion for, or signal, that a certain behavior will be
followed by reinforcement. These are called discriminative stimuli. Think about these as
traffic lights for behaviors, where a green light would indicate that a behavior is
acceptable and will be reinforced, while a red light would indicate the behavior is not
acceptable and will not be reinforced. This is a manner in which we can control the
behaviors of others to some extent. An obvious example of this is the bell that rings at
the end of class to signal it's over. If the bell rings, then the student can get up and run
out of class to join his friends in the hall or head to the playground. However, if the bell
has not rung and the student gets up and runs out of class, there is a very different
consequence. Likewise, a teacher can let students know that if they ask a question
after she says "Are there any questions?" then she will answer that question thoroughly
and completely. However, if she has not said "Are there any questions?" and a student
asks a question, she will ignore that question and not respond to it. Her statement "Are
there any questions?" is a discriminative stimulus that sets the occasion for
reinforcement.
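The teacher's "Are there any questions?" example can be sketched as a minimal
Python function. The behavior names and return strings here are hypothetical labels
chosen for illustration:

```python
def consequence(behavior, signal_given):
    """A discriminative stimulus acts as a 'green light': the very same
    behavior is reinforced only when the signal has set the occasion."""
    if behavior == "ask question":
        return "answered thoroughly" if signal_given else "ignored"
    return "no consequence"

consequence("ask question", True)    # reinforced: 'answered thoroughly'
consequence("ask question", False)   # not reinforced: 'ignored'
```

The point the sketch makes is that the consequence depends jointly on the behavior
and on whether the discriminative stimulus is present, which is why such stimuli let us
influence when others emit a behavior.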

9. Describe interval and ratio schedules of reinforcement (both fixed and variable)
and their effect upon responses.

It is not necessary or even desirable to reinforce every occurrence of a behavior in
order for learning to occur. Rather, learning can be reinforced on a schedule in which
each response does not itself result in reinforcement. For example, you may provide
reinforcement to a learner after every third correct response, not after each correct
response. This would be an example of a ratio schedule of reinforcement because it is
based on providing reinforcement after a certain number of responses. In the example
given, reinforcement is provided for every third correct response, so this could be
considered a 1:3 ratio schedule of reinforcement. It would be a fixed ratio schedule
because the ratio of every third response remains constant. Another type of ratio
schedule can happen when you vary how many responses have to happen before one
is reinforced. You may provide reinforcement after the sixth response, then after the
fourth response, then after the fifth response, then after the second response, then
after the eighth response, and so forth. Overall this might average out to be a
reinforcement after every fourth response, in which case it would be called a 1:4
variable ratio schedule of reinforcement. Another way to provide reinforcement is
based not on the number of responses but rather on the amount of time that has
elapsed since the last reinforcement was given. These schedules can be based on a
fixed time interval or a variable time interval. In a fixed interval schedule you would
reinforce the next response after that interval had passed. Thus, you may reinforce the
next response that occurs after a two-minute interval has passed if this was a
two-minute fixed interval schedule. The amount of time in a variable interval schedule
can also vary, just as the number of responses in a variable ratio schedule was not
constant.
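The schedules described above can be sketched as small Python decision functions.
This is a hedged illustration: the 1:3 ratio and two-minute interval come from the text,
while drawing the variable-ratio requirement uniformly from 1 to 2n-1 (so it averages n)
is a simplifying assumption of this sketch.

```python
import random

def fixed_ratio(n):
    """Reinforce every nth response (n=3 gives the text's 1:3 schedule)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        return count % n == 0          # True means reinforcement is delivered
    return respond

def variable_ratio(mean, rng=random):
    """Reinforce after a varying number of responses averaging `mean`
    (the uniform draw from 1..2*mean-1 is an illustrative assumption)."""
    required = rng.randint(1, 2 * mean - 1)
    count = 0
    def respond():
        nonlocal count, required
        count += 1
        if count >= required:
            count, required = 0, rng.randint(1, 2 * mean - 1)
            return True
        return False
    return respond

def fixed_interval(seconds, clock):
    """Reinforce the first response emitted after `seconds` have elapsed
    since the last reinforcement."""
    last = clock()
    def respond():
        nonlocal last
        now = clock()
        if now - last >= seconds:
            last = now
            return True
        return False
    return respond

# Fixed ratio 1:3 -- every third response is reinforced:
fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])   # [False, False, True, False, False, True]
```

Notice that the ratio schedules decide based on a response count while the interval
schedule decides based on elapsed time, which is exactly the distinction the paragraph
above draws.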

10. Distinguish between primary and secondary reinforcers.

Primary reinforcers are those things, like food and water and air, that don't have to be
learned in order to work as a reinforcer. Secondary reinforcers are things, like praise
and money, that have to be learned through being paired with primary reinforcers
before they have reinforcing value.

11. Describe Skinner's ideas as applied to verbal behavior.


For Skinner, behavior is behavior, and all behavior follows the same laws or principles.
He makes no distinction between verbal behavior and motor behavior. We learn to talk
and communicate the same way we learn a motor behavior like walking. There is
nothing special about our verbal behavior. It's just another form of behavior.

12. Describe how Skinner's theory accounts for motivation, drive, and creativity.

Skinner accounts for creativity in the same manner that he accounts for all other
behavior. There's nothing special about creativity in Skinner's system except that it's
much less likely to be seen in the general public. In the past, some people have been
reinforced for behaviors that we define as creative. Because these behaviors have
been reinforced, these people persist with these behaviors that we have defined as
creative. Thus, we call them creative people. Likewise with people who have
considerable drive and whom we are inclined to say are highly motivated. This is not
some innate, internal characteristic unique to these few people. Rather, it's the logical
consequence of what happens when, in the past, people have been reinforced for
setting goals and then persisting with tasks and achieving the goals.
