Key Contributors


**Contributors to the Study of Instrumental Conditioning**

Although their experiments were performed long ago, psychologists Edward Lee Thorndike and B.F. Skinner made lasting contributions to the field of behavioral psychology. Even today, "the most familiar images of instrumental and operant conditioning are the maze and the 'Skinner box'" (Abramson, 1994, p. 152).

__**Edward Lee Thorndike**__

Edward Lee Thorndike was a nineteenth-century American psychologist who contributed extensively to the field of behavioral psychology by studying animals (Reinemeyer, 1999). Among his best-known studies are his 1898 "puzzle box" experiments (Fig. 1), in which he observed the behaviors of cats. In these trials, cats were placed into puzzle boxes - wooden crates with a rigged door that the animal inside could open through a trip mechanism (Terry, 2009). The cat inside the puzzle box had to perform different tasks to activate the door-opening mechanism and escape from the box. Thorndike intentionally chose tasks that were new, unlearned behaviors for the cats, such as pulling a string or pushing objects, because he wanted to study their ability to acquire these behaviors (Terry, 2009). Thorndike's puzzle box experiments led to many concepts that we now consider a part of instrumental learning, such as trial and error, stereotyping, and the law of effect.

Thorndike's findings strengthened the idea that behaviors are shaped by their consequences, an idea he formulated as his law of effect. He concluded that learning was based on the presence of a reward rather than on logical thinking, and therefore occurred automatically as opposed to occurring through intelligent thought. "As intelligent as many animals (including humans) are, much of their learning is nevertheless governed by trial and error" (Terry, 2009, p. 89).

__**B.F. Skinner**__

B.F. Skinner began studying principles of learning by reinforcement through experiments in the 1930s and "used the label operant conditioning to indicate that the response operates on the environment to produce a certain outcome" (Terry, 2009, p. 123). In his publication //A Brief Survey of Operant Behavior//, Skinner acknowledged Thorndike's experiments; however, he argued that the process of learning was not one of trial and error. He emphasized that learning is strictly related to a behavior's rewards and consequences. Like Thorndike, Skinner reached his conclusions by conducting experiments with animals. Psychology professor Charles Brewer, PhD, of Furman University commented on Skinner's legacy, stating that "the work he did with pigeons and rats in the laboratory has been applied more widely in real-world applications than any other psychologist's" (Greengrass, 2004, p. 1).

To condition animals for his studies, Skinner created what is now called a "Skinner box" (Fig. 2). In these experiments, a rat or pigeon was placed inside a small chamber where its actions (such as pulling a lever) were followed by reinforcing stimuli (such as food). "Skinner coined the label //operant response//, as a contrast to the Pavlovian conditioned response, to indicate that the subject's response operates on the environment to produce a certain outcome" (Terry, 2009, p. 90). Skinner's contributions to the field were so influential that Skinner boxes are in fact still used today to study instrumental learning.

Despite the evidence these psychologists collected to illustrate the importance of reinforcement in learning, some criticisms still exist today. To read about some of these objections to the ideas of instrumental conditioning, click here.