Term
|
Definition
An increase or decrease in operant response as a function of the consequences that followed the response. |
|
|
Term
|
Definition
A behavior that operates on the environment to produce a change, effect, or consequence. These environmental changes select the operant appropriate to a given setting or circumstance. That is, particular responses increase or decrease in a situation as a function of the consequences that they produced in the past. This behavior is emitted instead of elicited, in the sense that the behavior may occur at some frequency before any known conditioning. |
|
|
Term
|
Definition
Any stimulus or event that increases the probability (rate of response) of an operant when presented. |
|
|
Term
|
Definition
The physical form or characteristics of the response. The topography of response is related to the contingencies of reinforcement in the sense that the form of response can be broadened or restricted by the contingencies. |
|
|
Term
|
Definition
A class or set of responses that vary in topography but produce a common environmental consequence or effect. The response class of turning on the light has many variations in form. |
|
|
Term
|
Definition
This behavior occurs at some probability in the presence of a discriminative stimulus, but the discriminative stimulus does not force its occurrence. |
|
|
Term
|
Definition
Respondent and reflexive behavior is ______ in the sense that the behavior is made to occur by the presentation of a stimulus. |
|
|
Term
|
Definition
An event or stimulus that precedes an operant and sets the occasion for operant behavior. |
|
|
Term
Differential Reinforcement |
|
Definition
In discrimination procedures, differential reinforcement involves reinforcement in the presence of one stimulus (Sd) but not in other settings (S-delta). The result is that the organism comes to respond when the Sd is presented and to show a low probability of responding in settings that have not resulted in reinforcement (S-delta). A differential response in Sd and S-delta situations is called discrimination, and an organism that shows this differential response is said to discriminate the occasion for reinforcement. |
|
|
Term
|
Definition
When an operant does not produce reinforcement, the stimulus that precedes the operant is called an S-Delta. In the presence of an S-Delta, the probability of emitting an operant declines. |
|
|
Term
Contingency of Reinforcement |
|
Definition
A definition of the relationship between the occasion, the operant class, and the consequences that follow the behavior. We change the contingencies by altering one of the components and observing the effect on behavior. The effectiveness of reinforcement contingencies depends on motivational events called establishing operations. |
|
|
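A minimal Python sketch (the class and example values are hypothetical, not from the source) of the three-term contingency as a simple data structure; changing any one component defines a different contingency:

    from dataclasses import dataclass

    @dataclass
    class Contingency:
        occasion: str        # event or stimulus that sets the occasion (e.g., an Sd)
        operant_class: str   # class of responses defined by their common effect
        consequence: str     # event that follows the behavior

    # Hypothetical example: lever pressing in the presence of a light.
    lever_pressing = Contingency(
        occasion="light on",
        operant_class="lever press",
        consequence="food pellet delivered",
    )
    print(lever_pressing)

    # Altering one component (here, the consequence) yields a new contingency.
    extinction = Contingency("light on", "lever press", "no programmed consequence")
    print(extinction)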
Term
|
Definition
positive reinforcement, negative reinforcement, positive punishment, negative punishment |
|
|
Term
|
Definition
A contingency that involves the presentation of an event or stimulus following an operant that increases the rate of response. |
|
|
Term
|
Definition
A contingency where an ongoing stimulus or event is removed by some response and the rate of response increases. When operant behavior increases by removing an ongoing event or stimulus, the contingency is called escape. The contingency is called avoidance when the operant increases by preventing the onset of the event or stimulus. |
|
|
Term
|
Definition
A procedure that involves the presentation of an event or stimulus following behavior that has the effect of decreasing the rate of response.
|
|
Term
|
Definition
A contingency that involves the removal of an event or stimulus following behavior that has the effect of decreasing the rate of response. The negative punishment procedure requires that behavior is maintained by positive reinforcement and the reinforcer is removed if a specified response occurs. The probability of response is reduced by the procedure. |
|
|
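A short Python sketch (a study aid of my own; the wording of the keys is an assumption, not source text) that classifies the four basic contingencies by whether a stimulus is presented or removed and whether the rate of response increases or decreases:

    # Key: (what the consequence does to the stimulus, effect on response rate).
    contingencies = {
        ("stimulus presented", "rate increases"): "positive reinforcement",
        ("stimulus removed",   "rate increases"): "negative reinforcement (escape/avoidance)",
        ("stimulus presented", "rate decreases"): "positive punishment",
        ("stimulus removed",   "rate decreases"): "negative punishment",
    }

    for (operation, effect), name in contingencies.items():
        print(f"{operation:18} + {effect:14} -> {name}")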
Term
|
Definition
A higher-frequency behavior will function as reinforcement for a lower-frequency behavior. |
|
|
Term
|
Definition
The differences in relative frequency or probability of different responses in a free-choice or baseline setting. |
|
|
Term
|
Definition
In the response-deprivation hypothesis, the instrumental response is the behavior that produces the opportunity to engage in some activity. |
|
|
Term
|
Definition
In the response-deprivation hypothesis, this is the activity obtained by making the instrumental response, as in the contingency "if activity A occurs, then the opportunity to engage in activity B occurs." |
|
|
Term
|
Definition
The principle that organisms work to gain access to activities that are restricted or withheld (deprivation), presumably to reinstate equilibrium or free-choice levels of behavior. |
|
|
Term
|
Definition
The time from the onset of one event to the onset of another. |
|
|
Term
|
Definition
This law refers to the stamping in (or out) of some response. As the principle of reinforcement, operants may be followed by consequences that increase (or decrease) the probability of the response. |
|
|
Term
|
Definition
The number of responses that occur in a given interval.
|
|
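A tiny worked example in Python (the counts and interval are hypothetical) showing that the rate is simply the response count divided by the length of the observation interval:

    responses = 120          # responses counted during the observation period
    interval_minutes = 30    # length of the observation interval

    rate = responses / interval_minutes
    print(f"{rate:.1f} responses per minute")   # -> 4.0 responses per minute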
Term
|
Definition
A method used to investigate reinforcement in the neuron. It involves increasing calcium bursts or firings by injection of dopamine agonists or agents. |
|
|
Term
|
Definition
Alterations of neurons and neural interconnections during a lifetime by changes in environmental contingencies. |
|
|
Term
|
Definition
The probability that an operant will occur on a given occasion. |
|
|
Term
|
Definition
The number of responses that occur in a given interval. |
|
|
Term
|
Definition
A method in which an organism may repeatedly respond over an extensive period of time. The organism is "free" to emit many responses or none at all. More accurately, responses can be made without interference by the experimenter. |
|
|
Term
|
Definition
A laboratory enclosure or box used to investigate operant conditioning. |
|
|
Term
|
Definition
The procedure of restricting access to a reinforcing event. Withholding an event or stimulus increases its effectiveness as a reinforcer. |
|
|
Term
|
Definition
A procedure that involves following the click of the feeder (stimulus) with the presentation of food. |
|
|
Term
|
Definition
An event or stimulus that has acquired its effectiveness to increase operant rate on the basis of an organism's life or ontogenetic history. |
|
|
Term
|
Definition
A class or set of responses that vary in topography but produce a common environmental consequence or effect. |
|
|
Term
|
Definition
The rate of an operant before any known conditioning. |
|
|
Term
|
Definition
When each response produces reinforcement.
|
|
Term
|
Definition
All of the behavior that an organism is capable of emitting on the basis of species and environmental history. |
|
|
Term
Shaping/Successive Approximation |
|
Definition
The method of successive approximation or shaping may be used to establish a response. This method involves the reinforcement of closer and closer approximations to the final performance. Each step of the procedure involves reinforcement of closer approximations and nonreinforcement of more distant responses. Many novel forms of behavior may be shaped by the method of successive approximation. |
|
|
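A toy Python simulation (a sketch under assumed values; the response model, forces, and step size are not from the source) of shaping by successive approximation: a response that meets the current criterion is reinforced and the criterion then moves a step closer to the final performance, while more distant responses go unreinforced:

    import random

    random.seed(1)

    target_force = 10.0   # force required for the final performance
    criterion = 2.0       # initial, easily met criterion
    typical_force = 1.5   # typical response force before conditioning
    step = 0.5            # how much the criterion tightens after each reinforcement

    for trial in range(1, 61):
        # Response force varies around the level most recently reinforced.
        force = random.gauss(typical_force, 1.0)
        if force >= criterion:
            # Closer approximation: reinforce; this form becomes more probable.
            typical_force = force
            criterion = min(criterion + step, target_force)
            outcome = "reinforced"
        else:
            # More distant response: no reinforcement.
            outcome = "not reinforced"
        print(f"trial {trial:2d}: force={force:5.2f} criterion={criterion:5.2f} {outcome}")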
Term
|
Definition
The tendency of an animal to emit variations in response form in a given situation. The range of behavioral variation is related to the animal's capabilities based on genetic endowment, degree of neuroplasticity, and previous interactions with the environment. Behavioral variability in a shaping procedure allows for selection by reinforcing consequences, and is analogous to the role of genetic variability in natural selection. |
|
|
Term
|
Definition
The selection of operant behavior during the lifetime of an organism. The process involves operant variability during periods of extinction and selection by contingencies of reinforcement. An organism that alters its behavior on the basis of changing life experiences is showing ontogenetic selection. Variability in the topography and frequency of behavior increases when reinforcement is withheld. |
|
|
Term
|
Definition
A laboratory instrument that is used to record the frequency of operant behavior in real time. |
|
|
Term
|
Definition
A real-time graphical representation of operant rate. Each response produces a constant upward increment on the y-axis, and time is indexed on the x-axis. The faster the rate of response, the steeper the slope or rise of the cumulative record. |
|
|
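A brief Python sketch (the response times are made up for illustration) that builds the points of a cumulative record: each response adds one constant increment on the y-axis, time runs along the x-axis, and a steeper overall slope corresponds to a faster rate of response:

    # Hypothetical times (in seconds) at which responses occurred.
    response_times = [2, 5, 6, 7, 9, 15, 24]

    record = [(0, 0)]   # points of the record as (time, cumulative responses)
    for count, t in enumerate(response_times, start=1):
        record.append((t, count))

    for t, count in record:
        print(f"t = {t:3d} s   cumulative responses = {count}")

    # Overall slope = total responses / total time (responses per second).
    overall_rate = record[-1][1] / record[-1][0]
    print(f"overall rate = {overall_rate:.2f} responses per second")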
Term
|
Definition
The body weight of an organism that has free access to food 24 hours a day.
|
|
Term
|
Definition
Repeated presentations of a reinforcer weaken its effectiveness and for this reason the rate of response declines. |
|
|
Term
|
Definition
Involves breaking the contingency between an operant and its consequence. |
|
|
Term
|
Definition
A rapid burst of response when an extinction procedure is first implemented. |
|
|
Term
|
Definition
Operant behavior becomes increasingly variable as extinction proceeds. From an evolutionary view, it makes sense to try different ways of acting when something no longer works. Behavioral variation increases the chances that the organism will reinstate reinforcement, which increases the likelihood of its survival and reproduction. |
|
|
Term
|
Definition
Reinforcement can be made contingent on the force or magnitude of response. Force or magnitude is a property or dimension of behavior. |
|
|
Term
|
Definition
When reinforcement is contingent on some difference in response properties, that form of response will increase. For example, the force or magnitude of response can be differentiated; if contingencies of reinforcement require a forceful or vigorous response in a particular situation, then that form of response will predominate. |
|
|
Term
|
Definition
A response, such as "wing flapping" in birds, that occurs with the change in contingencies from reinforcement to extinction. Aggression is a common response.
|
|
Term
Discrimination Extinction |
|
Definition
A low rate of operant behavior that occurs as a function of an S-Delta. The probability of putting coins in a vending machine with an "out of order" sign on it is very low. |
|
|
Term
|
Definition
The perseverance of operant behavior when it is placed on extinction. Resistance to extinction is substantially increased when an intermittent schedule of reinforcement has been used to maintain behavior. |
|
|
Term
Intermittent Schedule of Reinforcement |
|
Definition
A schedule programmed so that some rather than all operants are reinforced. In other words, an intermittent schedule is any schedule of reinforcement other than continuous reinforcement. |
|
|
Term
Partial Reinforcement Effect (PRE) |
|
Definition
These schedules generate greater resistance to extinction than continuous reinforcement. The higher the rate of reinforcement, the greater the resistance to change; however, the change from CRF to extinction is discriminated more rapidly than the change from intermittent reinforcement to extinction. |
|
|
Term
|
Definition
After a period of extinction, an organism's rate of response may be close to operant level. After some time, the organism is again placed in the setting and extinction is continued. Responding initially recovers, but over repeated sessions of extinction the amount of recovery decreases. |
|
|
Term
|
Definition
The recovery of behavior when the reinforcer is presented alone after a period of extinction. In an operant procedure, reinstatement involves reinforcement of a response followed by extinction. After extinction, response-independent reinforcement is arranged and the opportunity to respond is removed. That is followed by tests that reinstate the opportunity to respond. |
|
|