Term
|
Definition
Thorndike referred to this as the relationship between behavior and its consequences. Behavior is a function of its consequences. |
|
|
Term
|
Definition
Experiences whereby behavior is strengthened or weakened by its consequences. The behavior operates on the environment. The behavior is typically instrumental in producing important consequences, so this type of learning is also sometimes called instrumental learning. We act on the environment and change it, and the change thus produced strengthens or weakens the behavior that produced the consequences. In operant learning, the individual must act. |
|
|
Term
|
Definition
An increase in the strength of behavior due to its consequence. An experience must have three characteristics to qualify as reinforcement: First, a behavior must have a consequence. Second, the behavior must increase in strength. Third, the increase in strength must be the result of the consequence. |
|
|
Term
|
Definition
A behavior is followed by the appearance of, or an increase in the intensity of, a stimulus, known as a positive reinforcer. |
|
|
Term
|
Definition
Something an individual seeks out. Its effect is to strengthen the behavior that precedes it. When presented following a behavior, it must strengthen that behavior. |
|
|
Term
|
Definition
A behavior is strengthened by the removal of, or a decrease in the intensity of, a stimulus. What reinforces behavior in negative reinforcement is escaping from an aversive stimulus. Sometimes referred to as escape-avoidance learning. |
|
|
Term
|
Definition
Something an individual ordinarily tries to avoid. The only way to be sure an event is a negative reinforcer is to determine whether its removal strengthens behavior. |
|
|
Term
|
Definition
The defining feature is that the behavior of the participant ends the trial. The dependent variable is often the time taken to perform the behavior under study. Ex. Thorndike's puzzle box experiments with cats. |
|
|
Term
|
Definition
In this approach, the behavior may be repeated any number of times. Usually the dependent variable in free operant experiments is the number of times a particular behavior, such as pressing a lever or pecking a disk, occurs per minute. Ex. Skinner's operant chamber for rats. |
|
|