Term
|
Definition
An increase or decrease in operant response as a function of the consequences that follow the response. |
|
|
Term
|
Definition
Behavior that operates on the environment to produce a change, effect, or consequence. Particular responses increase or decrease in a situation as a function of the consequences that they produced in the past. This type of behavior is said to be emitted (rather than elicited) in the sense that behavior may occur at some frequency before any known conditioning. |
|
|
Term
|
Definition
Any stimulus or event that increases the probability (rate of response) of an operant when presented. |
|
|
Term
|
Definition
The physical form or characteristics of the response. Related to the contingencies of reinforcement in the sense that the form of response can be broadened or restricted by contingencies. Generally it is a function of the contingencies of reinforcement. |
|
|
Term
|
Definition
A class or set of responses that vary in topography but produce a common environmental consequence or effect.
The response class of turning on the light has many variations in form (turn on the light with the left index finger, or with the right one, or with the side of the hand, or by saying to someone "Please turn on the light.") |
|
|
Term
|
Definition
Behavior occurs at some probability in the presence of a discriminative stimulus (Sd), but the Sd does not force its occurrence. |
|
|
Term
|
Definition
Respondent (CR) and reflexive (UR) behavior is made to occur by the presentation of a stimulus (CS or US). |
|
|
Term
|
Definition
An event or stimulus that precedes an operant and sets the occasion for operant behavior (antecedent stimulus). |
|
|
Term
Differential Reinforcement |
|
Definition
In discrimination procedures, it involves reinforcement in the presence of one stimulus (Sd) but not in other settings (Sdelta). The result is that the organism comes to respond when the Sd is present and to show a low probability of responding in settings that have not resulted in reinforcement (Sdelta). An organism that shows this differential response is said to discriminate the occasion for reinforcement. |
|
|
Term
|
Definition
When an operant does not produce reinforcement, the stimulus that precedes the operant is called this. In the presence of this, the probability of emitting an operant declines. |
|
|
Term
Contingency of Reinforcement |
|
Definition
A definition of the relationship between the occasion, the operant class, and the consequences that follow the behavior (e.g., Sd: R --> Sr). We change them by altering one of the components and observing the effect on behavior. They can include more than three terms, as in conditional discrimination; also, the effectiveness of them depends on motivational events called establishing operations (e.g., deprivation and satiation). |
|
|
Term
|
Definition
A contingency that involves the presentation of an event or stimulus following an operant that increases the rate of response. |
|
|
Term
|
Definition
A contingency where an ongoing stimulus or event is removed (or prevented) by some response (operant) and the rate of response increases. If it is raining, opening and standing under an umbrella removes the rain and maintains the use of the umbrella on rainy days. |
|
|
Term
|
Definition
A procedure that involves the presentation of an event or stimulus following behavior that has the effect of decreasing the rate of response. For example, a child is given a spanking for running into the street, and as a result the probability of the behavior is decreased. |
|
|
Term
|
Definition
A contingency that involves the removal of an event or stimulus following behavior that has the effect of decreasing the rate of response. Behavior is maintained by positive reinforcement and the reinforcer is removed if a specified response occurs. The probability of response is reduced by the procedure. |
|
|
Term
|
Definition
When operant behavior increases by removing an ongoing event or stimulus, the contingency is called _____. |
|
|
Term
|
Definition
The contingency is called _____ when the operant increases by preventing the onset of the event or stimulus. |
|
|
Term
|
Definition
A higher-frequency behavior will function as reinforcement for a lower-frequency behavior. |
|
|
Term
|
Definition
The differences in relative frequency or probability of different responses in a free-choice or baseline setting. For a rat, the probability of eating, drinking, and wheel running might form this, with eating occurring most frequently and wheel running less often. |
|
|
Term
|
Definition
In the response-deprivation hypothesis, the instrumental response is the behavior that produces the opportunity to engage in some activity. |
|
|
Term
|
Definition
In the response-deprivation hypothesis, this is the activity obtained by making the instrumental response, as in if activity A occurs (instrumental response) then the opportunity to engage in activity B (_____) occurs. |
|
|
Term
Response-Deprivation Hypothesis |
|
Definition
The principle that organisms work to gain access to activities that are restricted or withheld (Deprivation), presumably to reinstate equilibrium or free-choice levels of behavior. This principle is more general than the Premack principle, predicting when any activity (high or low in rate) will function as reinforcement. |
|
|
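The response-deprivation condition is commonly stated as an inequality: the contingent activity is predicted to reinforce the instrumental response whenever the schedule restricts the contingent activity below its baseline proportion. A minimal sketch, assuming that reading (the function and variable names here are illustrative, not from the source):

```python
def creates_response_deprivation(i_required, c_allowed, o_i, o_c):
    """Response-deprivation condition (Timberlake & Allison).

    i_required: instrumental responses the schedule requires
    c_allowed:  amount of contingent activity the schedule then permits
    o_i, o_c:   baseline (free-choice) levels of the two activities

    The contingent activity should function as reinforcement when
    meeting the schedule would hold the organism below its baseline
    level of that activity, i.e. when i_required / c_allowed > o_i / o_c.
    """
    return i_required / c_allowed > o_i / o_c

# Baseline: 100 licks and 50 wheel turns (ratio 2). A schedule requiring
# 10 licks for 2 wheel turns (ratio 5) deprives the rat of running:
print(creates_response_deprivation(10, 2, 100, 50))   # True
# A schedule of 10 licks for 10 wheel turns (ratio 1) does not:
print(creates_response_deprivation(10, 10, 100, 50))  # False
```

This makes the "more general than Premack" point concrete: the inequality can hold even when the contingent activity is the lower-rate behavior at baseline.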
Term
|
Definition
The time from the onset of one event to the onset of another (e.g., the time it takes a rat to reach a goal box after it has been released in a maze). |
|
|
Term
|
Definition
As originally stated by Thorndike, this law refers to stamping in (or out) some response. For example, a cat opened a puzzle-box door more rapidly over repeated trials. Currently the law is stated as the principle of reinforcement - operants may be followed by consequences that increase (or decrease) the probability or rate of response. |
|
|
Term
|
Definition
The number of responses that occur in a given interval. For example, a bird may peck a key for food 2 times per second, or a student may do math problems at the rate of 10 problems per hour. |
|
|
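Rate of response is a simple ratio of count to observation time, so the two examples in the definition reduce to one line of arithmetic (a minimal sketch; the function name is mine):

```python
def response_rate(count, interval):
    """Rate of response: number of responses divided by the interval."""
    return count / interval

# The examples from the definition:
print(response_rate(120, 60))  # 120 key pecks in 60 s -> 2.0 pecks per second
print(response_rate(10, 1))    # 10 math problems in 1 hour -> 10.0 per hour
```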
Term
in-vitro reinforcement (IVR) |
|
Definition
A method used to investigate reinforcement in the neuron. It involves increasing calcium bursts or firings by injection of dopamine agonists or other agents. |
|
|
Term
|
Definition
Alterations of neurons and neural interconnections during a lifetime by changes in environmental contingencies. |
|
|
Term
|
Definition
Another name for rate of response. It is a measure of the probability of behavior. |
|
|
Term
|
Definition
The probability that an operant will occur on a given occasion (measured as the rate of response). |
|
|
Term
|
Definition
A method in which an organism may repeatedly respond over an extensive period of time. The organism is "free" to emit as many responses as it will, or none at all. More accurately, responses can be made without interference by the experimenter (in contrast to a discrete-trials procedure). |
|
|
Term
|
Definition
A laboratory enclosure or box used to investigate operant conditioning. For a rat, it is a small, enclosed box that typically contains a lever with a light above it and a food magazine or cup connected to an external feeder. The feeder delivers a small food pellet when electronically activated. |
|
|
Term
|
Definition
The procedure of restricting access to a reinforcing event. Withholding an event or stimulus increases its effectiveness as a reinforcer. |
|
|
Term
|
Definition
A procedure that involves following the click of the feeder (stimulus) with the presentation of food (reinforcement). For example, a rat is placed in an operant chamber and a microcomputer periodically turns on the feeder. When the feeder is turned on, it makes a click and a food pellet falls into a cup. Because the click and the appearance of food are associated in time you would, after training, observe a typical rat staying close to the food magazine, quickly moving toward it when the feeder is operated. |
|
|
Term
|
Definition
An event or stimulus that has acquired its effectiveness to increase operant rate on the basis of an organism's life or ontogenetic history. |
|
|
Term
|
Definition
The rate of an operant before any known conditioning (e.g., the rate of key pecking before a peck-food contingency has been established). |
|
|
Term
Continuous Reinforcement (CRF) |
|
Definition
When each response produces reinforcement (e.g., each lever press produces food). |
|
|
Term
|
Definition
All of the behavior that an organism is capable of emitting on the basis of species and environmental history. |
|
|
Term
|
Definition
The method of successive approximation may be used to establish a response. This method involves reinforcement of closer and closer approximations to the final performance. Many novel forms of behavior may be shaped by this method. |
|
|
Term
|
Definition
A method by which shaping takes place. |
|
|
Term
|
Definition
The tendency of an animal to emit variations in response form in a given situation. The range of it is related to the animal's capabilities based on genetic endowment, degree of neuroplasticity, and previous interactions with the environment. |
|
|
Term
|
Definition
Behavioral variability in a shaping procedure allows for selection by reinforcing consequences, and is analogous to the role of _____ in natural selection. |
|
|
Term
|
Definition
The selection of operant behavior during the lifetime of an organism. The process involves operant variability during periods of extinction and selection by contingencies of reinforcement. |
|
|
Term
|
Definition
A laboratory instrument that is used to record the frequency of operant behavior in real time (rate of response). For example, paper is drawn across a roller at a constant speed, and each time a lever press occurs a pen steps up one increment. When reinforcement occurs, the same pen makes a downward deflection. Once the pen reaches the top of the paper, it resets to the bottom and starts to step up again. |
|
|
Term
|
Definition
A real-time graphical representation of operant rate. Each response produces a constant upward increment on the y-axis, and time is indexed on the x-axis. |
|
|
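A cumulative record can be sketched as a running count over time: every response adds one increment, and the slope over any window is the local rate of response. A minimal illustration (function names are mine, not standard apparatus terminology):

```python
def cumulative_record(response_times):
    """Return (time, cumulative count) pairs: each response steps the count up by one."""
    return [(t, n + 1) for n, t in enumerate(sorted(response_times))]

def local_rate(record, t0, t1):
    """Slope of the record between t0 and t1 = responses per unit time in that window."""
    count = sum(1 for t, _ in record if t0 < t <= t1)
    return count / (t1 - t0)

# Fast early responding, then a pause before the last response:
times = [0.5, 1.0, 1.2, 1.4, 1.5, 3.0]
record = cumulative_record(times)
print(record[-1])               # (3.0, 6): six responses in total
print(local_rate(record, 0, 2)) # 2.5 responses/s over the first two seconds
```

The steep early segment and flat later segment mirror what the pen traces on the paper version: the faster the responding, the steeper the rise.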
Term
|
Definition
The body weight of an organism that has free access to food 24 hours a day. |
|
|
Term
|
Definition
Repeated presentations of a reinforcer weaken its effectiveness, and for this reason the rate of response declines. |
|
|
Term
changing life experiences |
|
Definition
An organism that alters its behavior (adaptation) on the basis of _____ is showing ontogenetic selection. |
|
|
Term
|
Definition
The faster the rate of response, the ____ the slope or rise of the cumulative record. |
|
|
Term
|
Definition
The repeated presentation of a reinforcer is called a _____. |
|
|
Term
|
Definition
In satiation, the rate of response _____ because repeated presentations of the reinforcer weaken its effectiveness. |
|
|
Term
|
Definition
The procedure of extinction involves the breaking of contingency between an operant and its consequence. For example, bar pressing followed by food reinforcement no longer produces food. As a behavioral process, it refers to the decline in the frequency of the operant when this procedure is in effect. |
|
|
Term
|
Definition
A rapid burst of responses when an extinction procedure is first implemented. |
|
|
Term
|
Definition
Operant behavior becomes increasingly more variable as extinction proceeds. From an evolutionary view, it makes sense to try different ways of acting when something no longer works. That is, behavioral variation increases the chances that the organism will reinstate reinforcement or contact other sources of reinforcement, increasing the likelihood of its survival and reproduction. |
|
|
Term
|
Definition
Reinforcement can be made contingent on the force or magnitude of response. Force or magnitude is a property or dimension of behavior. |
|
|
Term
|
Definition
When reinforcement is contingent on some difference in response properties, that form of response will increase. For example, the force or magnitude of response can be differentiated; if the contingencies of reinforcement require a forceful or vigorous response in a particular situation, then that form of response will predominate. |
|
|
Term
|
Definition
A response such as "wing flapping" in birds that occurs with the change in contingencies from reinforcement to extinction. One common emotional response is called aggression (attacking another organism or target). |
|
|
Term
|
Definition
A low rate of operant behavior that occurs as a function of an Sdelta. For example, the probability of putting coins in a vending machine with an "out of order" sign on it is very low. |
|
|
Term
|
Definition
The perseverance of operant behavior when it is placed on extinction. It is substantially increased when an intermittent schedule of reinforcement has been used to maintain behavior. |
|
|
Term
Intermittent Schedule of Reinforcement |
|
Definition
A schedule programmed so that some rather than all operants are reinforced. In other words, it is any schedule of reinforcement other than continuous reinforcement (CRF). |
|
|
Term
Partial Reinforcement Effect (PRE) |
|
Definition
Partial (or intermittent) reinforcement schedules generate greater resistance to extinction than continuous reinforcement (CRF). The higher the rate of reinforcement, the greater the resistance to change; however, the change from CRF to extinction is discriminated more rapidly than the change from intermittent reinforcement to extinction. |
|
|
Term
|
Definition
After a period of extinction, an organism's rate of response may be close to operant level. After some time, the organism is again placed in the setting and extinction is continued. Responding initially recovers, but over repeated sessions of extinction the amount of recovery decreases. Repeated sessions of extinction eliminate stimulus control by extraneous features of the situation, and eventually "being placed in the setting" no longer occasions the operant. |
|
|
Term
|
Definition
The recovery of behavior when the reinforcer is presented alone (response independent) after a period of extinction. In an operant procedure, it involves reinforcement of a response followed by extinction. After extinction, response-independent reinforcement is arranged and the opportunity to respond is removed (e.g., by retracting the levers). This is followed by tests that reinstate the opportunity to respond. |
|
|