Introduction to Operant Conditioning

Operant Conditioning

The legacy that Burrhus Frederic Skinner left behind is an entire system based on operant conditioning, derived from his experimental work in psychology. In “Clinical Psychology,” Sundberg and Winebarger describe Skinner as an outstanding proponent of behaviorist theory: he was the most visible and most influential American psychologist in the second half of the 20th century. He focused on observable behavior, and his basic idea was that actions that are rewarded tend to be repeated.

This principle, that rewarded actions tend to be repeated, led Skinner to study behavior continually and to create numerous behavioral technologies:


Skinner’s psychological research began with careful observation, much of it spent discerning the reactions of numerous animals. These observations led him to create the observation box, an operant conditioning apparatus. This research produced Skinner’s system of operant conditioning: the idea that an organism is constantly in the process of operating on its environment, moving through its world and doing whatever it does.

During this operating, the organism encounters a special kind of stimulus called a reinforcing stimulus, or reinforcer. This stimulus has the effect of increasing the operant: the behavior occurring just before the reinforcer. As Boeree explains in “B.F. Skinner, Personality Theories,” operant conditioning is behavior followed by a consequence, and the nature of the consequence modifies the organism’s tendency to repeat the behavior.


Operant conditioning began with a box that Skinner built for rats. Inside the Skinner Box was a pedal, or bar, on one wall. When the bar was pressed, a small mechanism released a food pellet into the cage. The rat would bounce around the cage, run about, and do whatever rats do. When it accidentally pressed the bar, a food pellet fell into the cage. The rat quickly learned that pressing the bar produced food, and it would keep pressing the bar and collecting food pellets in a corner of the cage.

Boeree explains that the operant is the behavior occurring just before the reinforcer. A behavior no longer followed by the reinforcer will eventually stop occurring; however, if the behavior once again produces the reinforcing stimulus (pellets being released), it will quickly return to occurring frequently.
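The cycle described above, spontaneous behavior followed by reinforcement, then acquisition, then extinction when reinforcement stops, can be sketched as a toy simulation. This is a hypothetical illustrative model: the class name, parameters, and learning rule are assumptions made for the sketch, not Skinner’s own formalism.

```python
import random

class SkinnerBoxSim:
    """Toy model of a rat in a Skinner box (illustrative assumption, not Skinner's formalism)."""

    def __init__(self, p_press=0.05, learn=0.2, decay=0.05):
        self.p_press = p_press  # current tendency to press the bar on a given trial
        self.learn = learn      # how much a reinforced press raises that tendency
        self.decay = decay      # how much an unreinforced press lowers it (extinction)

    def trial(self, reinforce: bool) -> bool:
        """One trial: the rat may press the bar; the consequence modifies its tendency."""
        pressed = random.random() < self.p_press
        if pressed:
            if reinforce:
                # reinforcer delivered: the operant becomes more likely
                self.p_press = min(1.0, self.p_press + self.learn * (1.0 - self.p_press))
            else:
                # no reinforcer: the operant gradually becomes less likely
                self.p_press = max(0.01, self.p_press * (1.0 - self.decay))
        return pressed

random.seed(0)
box = SkinnerBoxSim()

for _ in range(200):            # acquisition: every press releases a pellet
    box.trial(reinforce=True)
acquired = box.p_press

for _ in range(500):            # extinction: pressing no longer releases pellets
    box.trial(reinforce=False)
extinguished = box.p_press

print(acquired, extinguished)
```

Running the sketch shows the pressing tendency climbing during the reinforced phase and falling back once the reinforcer is withheld, mirroring the acquisition and extinction pattern described in the text.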


In “Abnormal Psychology in a Changing World,” Nevid highlights Skinner’s belief that human behavior is the product of our genetic inheritance and environmental or situational influences. Skinner also thought carefully about broad concerns for human welfare, which began with the application of his principles.

Skinner examined key philosophical issues from the perspective of operant conditioning and had an abiding interest in education, applying his principles through teaching machines: pre-computer devices that presented programmed instruction materials, tests, and feedback. Skinner was thus a strong experimental designer who valued the careful use of ideas and thinking, which led to operant conditioning.


B. F. Skinner’s behavioristic approach came to play a central role in theories of learning. Skinner used symbolic approaches that led people to observe themselves in relation to each other and to their effects on the environment. Skinner duly left a legacy: his work and the system of operant conditioning.