Experimental analysis of behavior

The experimental analysis of behavior is a school of thought in psychology founded on B. F. Skinner's philosophy of radical behaviorism; it defines the basic principles used in applied behavior analysis. A central principle was the inductive, data-driven[1] examination of functional relations, as opposed to the hypothetico-deductive learning theories[2] that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by the observation of measurable behavior that could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.

Basic learning processes in behavior analysis

Classical (or respondent) conditioning

In classical or respondent conditioning, a relatively neutral stimulus (the conditioned stimulus) comes to signal the occurrence of a biologically significant stimulus (the unconditioned stimulus), such as food or pain. This is typically done by repeatedly pairing the two stimuli, as in Pavlov's experiments with dogs, where a bell was followed by food. As a result, the conditioned stimulus comes to elicit a conditioned response that is usually similar to the unconditioned response elicited by the unconditioned stimulus (e.g., salivation in Pavlov's dogs).[3]
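The growth of a conditioned response over repeated pairings can be illustrated with the Rescorla–Wagner model, a later formal model of classical conditioning (not described in the source text; the parameter values here are arbitrary):

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Rescorla-Wagner learning rule (illustrative model): on each
    CS-US pairing, associative strength V moves a fraction alpha of
    the way toward the asymptote lam, so learning is fast at first
    and slows as V approaches lam."""
    v = 0.0
    history = []
    for _ in range(trials):
        v += alpha * (lam - v)  # prediction-error update
        history.append(round(v, 3))
    return history

# Associative strength rises over five pairings, with diminishing gains.
print(rescorla_wagner(5))
```

The negatively accelerated curve this produces matches the typical shape of acquisition data: early pairings produce the largest gains in conditioned responding.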

Operant conditioning

Operant conditioning (also called "instrumental conditioning") is a learning process in which behavior is sensitive to, or controlled by, its consequences. For example, behavior that is followed by reward (reinforcement) becomes more probable, whereas behavior that is followed by punishment becomes less probable.[4] Many variations and details of this process may be found in the main article on operant conditioning.
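The core idea, that consequences shift response probability, can be sketched with a simple linear-operator learning rule (a toy model in the spirit of Bush and Mosteller, not taken from the source text; the learning rate is arbitrary):

```python
def update_probability(p, outcome, alpha=0.2):
    """Illustrative linear-operator update: reinforcement pushes the
    response probability toward 1, punishment pushes it toward 0,
    moving a fraction alpha of the remaining distance each time."""
    target = 1.0 if outcome == "reinforcement" else 0.0
    return p + alpha * (target - p)

# Five reinforced responses make the behavior markedly more probable.
p = 0.5
for _ in range(5):
    p = update_probability(p, "reinforcement")
print(round(p, 3))
```

A single call with `outcome="punishment"` moves the probability the other way, mirroring the asymmetry described above.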

Experimental tools in behavioral research

Operant conditioning chamber

The most commonly used tool in animal behavioral research is the operant conditioning chamber—also known as a Skinner box. The chamber is an enclosure designed to hold a test animal (often a rodent, pigeon, or primate). Its interior contains some type of device that presents discriminative stimuli, at least one mechanism to measure the subject's behavior as a rate of response—such as a lever or key-peck switch—and a mechanism to deliver consequences—such as a food-pellet dispenser or a token reinforcer such as an LED light.

Cumulative recorder

Of historical interest is the cumulative recorder, an instrument used to record the responses of subjects graphically. Traditionally, its graphing mechanism has consisted of a rotating drum of paper equipped with a marking needle. The needle would start at the bottom of the page and the drum would turn the roll of paper horizontally. Each subject response would result in the marking needle moving vertically along the paper one tick. This makes it possible for the rate of response to be calculated by finding the slope of the graph at a given point. For example, a regular rate of response would cause the needle to move vertically at a regular rate, resulting in a straight diagonal line rising towards the right. An accelerating or decelerating rate of response would lead to a quadratic (or similar) curve. For the most part, cumulative records are no longer graphed using rotating drums, but are recorded electronically instead.
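Since modern cumulative records are kept electronically, the drum-and-needle mechanism reduces to a list of response timestamps, with the local response rate recovered as the slope between adjacent points (a minimal sketch; the function names are illustrative):

```python
def cumulative_record(response_times):
    """Electronic analogue of the cumulative recorder: one
    (time, cumulative response count) point per response."""
    return [(t, i + 1) for i, t in enumerate(sorted(response_times))]

def local_rate(record, i):
    """Slope between adjacent points = responses per unit time,
    just as the needle's vertical speed did on the paper drum."""
    (t0, n0), (t1, n1) = record[i], record[i + 1]
    return (n1 - n0) / (t1 - t0)

# A steady rate (one response every 2 s) gives a constant slope of 0.5,
# i.e., the straight diagonal line described above.
record = cumulative_record([2, 4, 6, 8, 10])
print([local_rate(record, i) for i in range(len(record) - 1)])
```

An accelerating subject would instead produce increasing slopes from point to point, the electronic counterpart of the curved record.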

Key concepts

Laboratory methods employed in the experimental analysis of behavior are based upon B.F. Skinner's philosophy of radical behaviorism, which is premised upon:

  1. Everything that organisms do is behavior (including thinking), and
  2. All behavior is lawful and open to experimental analysis.

Central to operant conditioning is the three-term contingency (discriminative stimulus, response, reinforcing stimulus), used to describe functional relationships in the control of behavior:

  • A discriminative stimulus (SD) is a cue or stimulus context that sets the occasion for a response. For example, food on a plate sets the occasion for eating.
  • Behavior is a response (R), typically controlled by past consequences and by the presence of a discriminative stimulus. It operates on the environment; that is, it changes the environment in some way.
  • Consequences can consist of reinforcing stimuli (SR) or punishing stimuli (SP), which follow and modify an operant response. Reinforcing stimuli are often classified as positively (Sr+) or negatively (Sr−) reinforcing. Reinforcement may be governed by a schedule of reinforcement, that is, a rule that specifies when or how often a response is reinforced (see operant conditioning).

Other key concepts include:

  • Respondent conditioning, which depends on stimulus–response (S–R) relations: the unconditioned stimulus (US), conditioned stimulus (CS), neutral stimulus (NS), unconditioned response (UR), and conditioned response (CR).
  • Functional analysis
  • Data collection
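The three-term contingency and a schedule of reinforcement can be given a rough computational sketch (the names below are illustrative, not a standard notation):

```python
from dataclasses import dataclass

@dataclass
class Trial:
    """One observation under the three-term contingency."""
    discriminative_stimulus: str  # SD: cue that sets the occasion
    response: str                 # R: the operant response
    consequence: str              # Sr+/Sr-/SP: reinforcer or punisher

def fixed_ratio(n):
    """A fixed-ratio (FR-n) schedule of reinforcement: reinforcement
    is delivered after every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True  # reinforcer is due
        return False
    return respond

# Under FR-3, every third response produces reinforcement.
fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])

trial = Trial("lever light on", "lever press", "food pellet (Sr+)")
print(trial.consequence)
```

Other common schedules (variable-ratio, fixed- and variable-interval) would replace only the counting rule inside `respond`, which is why schedules are treated as interchangeable rules in the analysis.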

Anti-theoretical analysis

The idea that Skinner's position is anti-theoretical is probably inspired by the arguments he put forth in his article Are Theories of Learning Necessary?[5] However, that article did not argue against the use of theory as such, only against certain theories in certain contexts. Skinner argued that many theories did not explain behavior, but simply offered another layer of structure that itself had to be explained in turn. If an organism is said to have a drive, which causes its behavior, what then causes the drive? Skinner argued that many theories had the effect of halting research or generating useless research.

Skinner's work did have a basis in theory, though his theories were different from those that he criticized. Mecca Chiesa notes that Skinner's theories are inductively derived, while those that he attacked were deductively derived.[6] The theories that Skinner opposed often relied on mediating mechanisms and structures—such as a mechanism for memory as a part of the mind—which were not measurable or observable. Skinner's theories form the basis for two of his books: Verbal Behavior, and Science and Human Behavior. These two texts represent considerable theoretical extensions of his basic laboratory work into the realms of political science, linguistics, sociology and others.

Notable figures

  • Charles Ferster – pioneered errorless learning, which has since become a commonly used form of discrete trial training (DTT) for teaching autistic children, and co-authored Schedules of Reinforcement with B. F. Skinner.
  • Richard Herrnstein – developed the matching law, a mathematical model of choice, and co-authored the controversial The Bell Curve.
  • James Holland – co-wrote the highly cited and well-known programmed-instruction text The Analysis of Behavior with B.F. Skinner.
  • Fred S. Keller – creator of the Personalized System of Instruction (PSI).
  • Ogden Lindsley – founder of the Precision Teaching approach to teaching.
  • Jack Michael – noted verbal behavior and motivating operations theorist and researcher.
  • John Anthony (Tony) Nevin – developed behavioral momentum theory.
  • Howard Rachlin – pioneer in self-control research and behavioral economics.
  • Murray Sidman – discovered Sidman avoidance; highly cited author and researcher on punishment, and influential in research on stimulus equivalence.
  • Philip Hineline – contributed extensively to negative reinforcement (escape/avoidance), molecular/molar accounts of behavior processes, and the characteristics of interpretive language.
  • Allen Neuringer – well known for theoretical and experimental work on the perception of volition, behavioral variability and randomness, self-experimentation, and other areas.
  • Peter B. Dews – principal founder of behavioral pharmacology.[7]
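Herrnstein's matching law, noted above, states that the proportion of responses allocated to an alternative matches the proportion of reinforcement obtained from it, B1/(B1+B2) = R1/(R1+R2). A minimal sketch of its prediction (the function name is illustrative):

```python
def matching_proportion(r1, r2):
    """Strict matching law: predicted share of behavior allocated to
    option 1, given reinforcement rates r1 and r2 obtained from the
    two options: B1/(B1+B2) = R1/(R1+R2)."""
    return r1 / (r1 + r2)

# If option 1 yields 60 reinforcers/hour and option 2 yields 20,
# strict matching predicts 75% of responses go to option 1.
print(matching_proportion(60, 20))
```

Deviations from this strict form (undermatching, bias) are handled in later work by the generalized matching law, which adds sensitivity and bias parameters.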

References

  1. Chiesa, Mecca: Radical Behaviorism: The Philosophy and the Science (2005)
  2. Skinner, B.F.: Are Theories of Learning Necessary? (1950)
  3. Skinner, B.F.: The Evolution of Behavior (1984)
  4. Catania, A.C.: Learning (Prentice-Hall, 1992)
  5. Skinner, B.F. (July 1950). "Are theories of learning necessary?". Psychol Rev. 57 (4): 193–216. doi:10.1037/h0054367. PMID 15440996.
  6. Chiesa, Mecca (2005) Radical Behaviorism: The Philosophy and the Science
  7. Barrett, James E. (Spring 2013). "Peter B. Dews (1922–2012)". Behav. Anal. 36 (1): 179–182. doi:10.1007/BF03392303. PMC 3640885.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.