
Operant Conditioning Theory

A summary of B.F. Skinner and his "Skinner box" experiment and theory.

By definition, operant conditioning is "the behavior followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future" (Boeree). In simpler terms, when an animal or human acts on something and receives a reward, it is more likely to repeat the action. Likewise, if an organism receives negative feedback from an action, it is less likely to repeat that action. Dr. Burrhus F. Skinner, more commonly known as B.F. Skinner, developed this concept and sought to demonstrate it in the early to mid-1900s. His theory was then tested with the famous 'Skinner box' (Boeree).
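This core loop can be pictured as a simple update rule. The sketch below is only an illustration, not a model Skinner himself proposed; the starting probability, step size, and function names are hypothetical choices made to show the idea that a rewarded behavior becomes more likely and a punished one less likely.

    import random

    def update_tendency(tendency, consequence, step=0.1):
        # A rewarded behavior becomes more likely; a punished one less likely.
        if consequence == "reward":
            return min(1.0, tendency + step)
        if consequence == "punishment":
            return max(0.0, tendency - step)
        return tendency

    tendency = 0.5  # the organism starts out indifferent to pressing the pedal
    for trial in range(10):
        pressed = random.random() < tendency  # does the rat press the pedal this trial?
        if pressed:
            tendency = update_tendency(tendency, "reward")  # pressing earns a pellet
        print(f"trial {trial}: tendency to press = {tendency:.2f}")

Run over a handful of trials, the tendency drifts upward whenever pressing is rewarded, which is the pattern the definition above describes.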

Prior to his experiments in behavioral psychology, Fred Skinner, as his family knew him, enjoyed building various mechanisms and conducting science experiments with his younger brother. Although he grew up a curious and intellectual child, it was not until high school that he became truly fascinated with the inductive method in science (Moloney). He went on to Hamilton College in Clinton, New York, where he struggled at first, and then springboarded to Harvard University in 1928 (Moloney). It was at Harvard that he earned the master's degree and doctorate that enabled him to write various texts on psychology. Still interested in the inductive method, B.F. Skinner began the 'Skinner box' experiments.

In the 'Skinner box' experiment, a rat was placed in a cage with a pedal on one side. When the rat pushed the pedal (the behavior), a food pellet (the reinforcer) would drop into the cage. The rat quickly learned that multiple presses yielded multiple pellets (Skinner). However, the pellets were not available in large quantities; they had to be made by hand. Skinner, seizing the opportunity to manipulate the situation, established the schedules of reinforcement.

The original schedule, also called continuous reinforcement, delivered one food pellet for every pedal press (Moloney). Early in his experimentation, Skinner discovered the fixed-ratio schedule, in which the rat received one pellet for every 3, 5, or even 20 presses of the pedal. Skinner's fixed-interval schedule uses elapsed time, rather than the number of presses, as the variable (Moloney). Under this schedule, however, the rats would speed up their pressing as the time for the pellet to drop approached, rather than waiting patiently (Boeree). The last type, the variable schedule, was unpredictable: the number of presses or the length of the interval required for the next pellet kept changing, which Skinner compared to gambling in humans.
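To make the schedules concrete, here is a rough sketch of each rule as a small decision function. The ratio of 5, the 30-second interval, and the function names are hypothetical values chosen only for illustration, not figures Skinner reported.

    import random

    def continuous(press_count, seconds):
        # every single press is reinforced
        return True

    def fixed_ratio(press_count, seconds, n=5):
        # one pellet for every n presses
        return press_count % n == 0

    def fixed_interval(press_count, seconds, interval=30):
        # (simplified) a pellet is delivered each time the interval elapses
        return seconds % interval == 0

    def variable_ratio(press_count, seconds, mean=5):
        # the number of presses required is unpredictable, like gambling
        return random.random() < 1 / mean

    # Count pellets earned over 100 presses, assuming one press per second.
    schedules = [("continuous", continuous), ("fixed ratio", fixed_ratio),
                 ("fixed interval", fixed_interval), ("variable ratio", variable_ratio)]
    for name, rule in schedules:
        pellets = sum(rule(p, p) for p in range(1, 101))
        print(f"{name}: {pellets} pellets per 100 presses")

Comparing the counts makes the difference plain: continuous reinforcement pays on every press, the fixed schedules pay on a predictable pattern, and the variable schedule pays at unpredictable moments.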

The primary concept that many modern psychologists have taken from B.F. Skinner's experiments is the idea of behavior modification. The notion is that by removing reinforcements during undesired behavior and supplying them during desired behavior, the bad behavior will be extinguished (Boeree). To put it simply, an individual is rewarded for good behavior and not rewarded for bad behavior. This theory of behavior modification has become so popular over the past several decades that many parents and authority figures use these techniques with children. Although it has been adapted, the underlying theory remains the same.

Boeree, George. "B.F. Skinner." N.p., n.d. Web. 9 Oct. 2011.

Moloney, Sean. "Skinner, Burrhus Frederic (Fred)." N.p., Summer 2009. Web. 10 Oct. 2011.

Skinner, B.F. "'Superstition' in the Pigeon." Journal of Experimental Psychology 38 (1948): 168-172. Print.
