<P> Operant conditioning, sometimes called instrumental learning, was first extensively studied by Edward L. Thorndike (1874--1949), who observed the behavior of cats trying to escape from home-made puzzle boxes. A cat could escape from the box by a simple response such as pulling a cord or pushing a pole, but when first constrained the cats took a long time to get out. With repeated trials, ineffective responses occurred less frequently and successful responses occurred more frequently, so the cats escaped more and more quickly. Thorndike generalized this finding in his law of effect, which states that behaviors followed by satisfying consequences tend to be repeated, while those that produce unpleasant consequences are less likely to be repeated. In short, some consequences strengthen behavior and some consequences weaken behavior. By plotting escape time against trial number, Thorndike produced the first known animal learning curves. </P> <P> Humans appear to learn many simple behaviors through the sort of process studied by Thorndike, now called operant conditioning. That is, responses are retained when they lead to a successful outcome and discarded when they do not, or when they produce aversive effects. This usually happens without being planned by any "teacher", but operant conditioning has been used by parents in teaching their children for thousands of years. </P> <P> B.F. Skinner (1904--1990) is often referred to as the father of operant conditioning, and his work is frequently cited in connection with this topic. His book "The Behavior of Organisms", published in 1938, initiated his lifelong study of operant conditioning and its application to human and animal behavior. Following the ideas of Ernst Mach, Skinner rejected Thorndike's reference to unobservable mental states such as satisfaction, building his analysis instead on observable behavior and its equally observable consequences.
</P> <P> To implement his empirical approach, Skinner invented the operant conditioning chamber, or "Skinner Box", in which subjects such as pigeons and rats were isolated and could be exposed to carefully controlled stimuli. Unlike Thorndike's puzzle box, this arrangement allowed the subject to make one or two simple, repeatable responses, and the rate of such responses became Skinner's primary behavioral measure. Another invention, the cumulative recorder, produced a graphical record from which these response rates could be estimated. These records were the primary data that Skinner and his colleagues used to explore the effects on response rate of various reinforcement schedules. A reinforcement schedule may be defined as "any procedure that delivers reinforcement to an organism according to some well-defined rule". The effects of schedules became, in turn, the basic findings from which Skinner developed his account of operant conditioning. He also drew on many less formal observations of human and animal behavior. </P>
