Term
|
Definition
| Experimental Research - One-Way Designs |
|
|
Term
| In an experiment, the investigator: |
|
Definition
1) Manipulates the Independent Variable (or variables).
2) Assesses the impact of these different experiences. |
|
|
Term
| Manipulates the Independent Variable (or variables) |
|
Definition
| Arranges different experiences for the research participants. |
|
|
Term
| Assesses the impact of these different experiences |
|
Definition
| One or more measured variables (DVs). |
|
|
Term
| How do we determine whether one event causes another event: Does A cause B? |
|
Definition
1) Association 2) Temporal Priority 3) Control of Common Causal Variables |
|
|
Term
| Association |
|
Definition
| Is there a correlation between the independent and dependent variables? |
|
|
Term
| The causal relationships between variables in behavioral science are: |
|
Definition
|
|
Term
| Temporal Priority |
|
Definition
| If event A occurs before event B, then A might be causing B; If event A occurs after event B, then A cannot be causing B. |
|
|
Term
| Control of Common Causal Variables |
|
Definition
Causal statements: Require ruling out the influence of common-causal variables.
Experimental manipulations: The process through which the researcher rules out the possibility that the relationship between the independent and dependent variables is spurious. |
|
|
Term
| Causal statements |
|
Definition
| Require ruling out the influence of common-causal variables. |
|
|
Term
| Experimental manipulations |
|
Definition
| The process through which the researcher rules out the possibility that the relationship between the independent and dependent variables is spurious. |
|
|
Term
| A one-way experimental design |
|
Definition
1) Has one independent variable 2) The independent variable is created (manipulated) 3) Levels |
|
|
Term
| Levels |
|
Definition
| Refers to the specific situations that are created within the manipulation; In one-way designs, these are called the "experimental conditions". |
|
|
Term
| Equivalence can be created through |
|
Definition
Between-participants designs with different but equivalent participants in each level of the experiment
OR
Repeated-measures designs with the same people in each of the experimental conditions (also called “Within-subjects” design). |
|
|
Term
| Between-participants design |
|
Definition
| Different but equivalent participants in each level of the experiment |
|
|
Term
| Repeated-measures design |
|
Definition
| The same people participate in each of the experimental conditions (also called a “within-subjects” design). |
|
|
Term
| Random assignment to conditions |
|
Definition
The most common method of creating equivalence among the experimental conditions.
The level of the independent variable each participant will experience is determined through a random process (see the sketch below). |
|
|
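A minimal Python sketch of random assignment to conditions; the participant IDs, condition labels, and sample size are invented for illustration:

import random

# Hypothetical example: assign 20 participants to two conditions at random.
participants = [f"P{i:02d}" for i in range(1, 21)]
conditions = ["experimental"] * 10 + ["control"] * 10   # equal group sizes

random.shuffle(conditions)   # chance alone decides which level each person gets

for participant, condition in zip(participants, conditions):
    print(participant, condition)

Because assignment depends only on chance, the conditions should be equivalent, on average, on every participant characteristic.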
Term
| Variety and Number of Levels |
|
Definition
1) Experimental condition 2) Control condition |
|
|
Term
| Experimental condition |
|
Definition
| Level of the independent variable in which the situation of interest was created. |
|
|
Term
| Control condition |
|
Definition
| Level of the independent variable in which the situation was not created. |
|
|
Term
|
Definition
1) Experimental designs with only two levels have some limitations. 2) Difficulty drawing conclusions about the pattern of the relationship where the manipulation varies the strength of the independent variable. |
|
|
Term
| Experimental designs with only two levels have some limitations: |
|
Definition
1) Difficulty telling which of the two levels is causing a change in the dependent measure. 2) Is level 1 raising the dependent measure and level 2 lowering it? 3) Or is level 1 raising it while level 2 leaves it unchanged? |
|
|
Term
| Difficulty drawing conclusions about the pattern of the relationship where the manipulation varies the strength of the independent variable. |
|
Definition
| A control condition solves some of these issues. |
|
|
Term
| Detecting Curvilinear Relationships |
|
Definition
| An experimental design with only two levels cannot detect curvilinear relationships. |
|
|
Term
| Analysis of Variance (ANOVA) |
|
Definition
1) Compares the means of the dependent variable across the levels of an experimental research design. 2) Analyzes the variability of the dependent variable. 3) If the condition means are equivalent, there should be no differences among them except those due to chance. 4) If the experimental manipulation has influenced the dependent variable, there will be significantly more variability among the condition means than would be expected by chance. |
|
|
Term
| One-way analysis of variance (ANOVA) |
|
Definition
1) Used to compare the means on a dependent variable between two or more groups of participants who differ on a single independent variable. 2) H0: μ1 = μ2 = ... = μk (see the sketch below). |
|
|
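A minimal sketch of a one-way between-participants ANOVA using scipy.stats.f_oneway; the three groups of scores are invented for illustration:

from scipy import stats

# Hypothetical DV scores for three levels of a single manipulated IV.
low    = [4, 5, 6, 5, 7]
medium = [6, 7, 8, 7, 9]
high   = [5, 5, 6, 6, 7]

# Tests H0: mu1 = mu2 = mu3 (all condition means are equal).
f_value, p_value = stats.f_oneway(low, medium, high)
print(f"F = {f_value:.2f}, p = {p_value:.3f}")

# If p < alpha (e.g., .05), reject H0: at least one condition mean differs.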
Term
| Computation of a One-Way Between-Participants ANOVA |
|
Definition
1) Between-groups variance. 2) Within-groups variance. |
|
|
Term
| Between-groups variance |
|
Definition
| Variance among the condition means. |
|
|
Term
| Within-groups variance |
|
Definition
| Variance within the conditions. |
|
|
Term
| The F statistic is calculated as the ratio of the two variances: |
|
Definition
F = Between-groups variance / Within-groups variance (MS_between / MS_within)
1) Calculating the within-groups mean squares 2) Calculating the between-groups mean squares (a worked sketch follows below). |
|
|
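A worked sketch of the F ratio using only the Python standard library; the data are invented and equal group sizes are assumed:

import statistics

# Hypothetical data: k = 3 conditions, n = 5 participants per condition.
groups = [
    [4, 5, 6, 5, 7],
    [6, 7, 8, 7, 9],
    [5, 5, 6, 6, 7],
]
k = len(groups)
n = len(groups[0])                                   # equal n assumed
group_means = [statistics.mean(g) for g in groups]
grand_mean = statistics.mean(x for g in groups for x in g)

# Between-groups mean squares: variance among the condition means.
ss_between = sum(n * (m - grand_mean) ** 2 for m in group_means)
ms_between = ss_between / (k - 1)

# Within-groups mean squares: variance within the conditions.
ss_within = sum((x - m) ** 2 for g, m in zip(groups, group_means) for x in g)
ms_within = ss_within / (k * (n - 1))

F = ms_between / ms_within
print(f"MS_between = {ms_between:.2f}, MS_within = {ms_within:.2f}, F = {F:.2f}")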
Term
| Between groups variance is significantly greater than the within-groups variance |
|
Definition
1) Conclude the manipulation has influenced the dependent measure. 2) The null hypothesis that all the condition means are the same is rejected. |
|
|
Term
| Conclude the manipulation has influenced the dependent measure: |
|
Definition
| P-value is less than alpha. |
|
|
Term
| Factorial research designs |
|
Definition
| Most experimental research designs include more than one independent variable. |
|
|
Term
| Factor |
|
Definition
| Refers to each of the manipulated independent variables. |
|
|
Term
| Level |
|
Definition
| Refers to each condition within a particular independent variable. |
|
|
Term
| Factorial research designs: |
|
Definition
1) a 2 x 3 Design 2) a 2 x 2 Design 3) a 2 x 2 x 4 Design |
|
|
Term
| A 2 x 3 design |
|
Definition
Two factors: 1) The first factor has 2 levels. 2) The second factor has 3 levels. |
|
|
Term
| A 2 x 2 design |
|
Definition
Two factors: 1) Both factors have 2 levels. |
|
|
Term
| A 2 x 2 x 4 design |
|
Definition
Three factors: 1) Factor one and two have 2 levels. 2) Factor three has 4 levels. |
|
|
Term
| Cells |
|
Definition
| The conditions in factorial designs. |
|
|
Term
|
Definition
1) All participants complete the same DV (dependent measure). 2) Differences will be due to the influence of the IVs. |
|
|
Term
| Two-way factorial experimental design |
|
Definition
1) Two factors are manipulated in the same experiment. 2) Each level of one independent variable occurs with each level of the other independent variable. 3) The research hypothesis makes a very specific prediction about the pattern of means expected to be observed on the dependent measure. (A minimal analysis sketch follows below.) |
|
|
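A minimal sketch of a two-way between-participants ANOVA using pandas and statsmodels; the factor names (cartoon, prior) and the scores are invented to mirror the class example, not taken from it:

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical 2 x 2 data: each level of one IV occurs with each level of the other.
df = pd.DataFrame({
    "cartoon":    ["violent"] * 4 + ["nonviolent"] * 4,
    "prior":      ["frustrated", "frustrated", "calm", "calm"] * 2,
    "aggression": [5, 6, 4, 4, 3, 3, 2, 3],   # DV
})

# Crossing the two factors gives two main effects and their interaction.
model = smf.ols("aggression ~ C(cartoon) * C(prior)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))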
Term
| Schematic Diagram of a Factorial Design: |
|
Definition
| Greater than (>) and less than (<) signs show the expected relative values of the means. DV = Aggressive Play |
|
|
Term
| Marginal means |
|
Definition
Means combined across the levels of another factor.
Ex in class: marginal means for Cartoon Type = 4.15 & 2.71 (see the sketch below). |
|
|
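A minimal sketch of computing marginal means from a 2 x 2 table of cell means with numpy; the cell means are invented and equal cell sizes are assumed, so the in-class values are not reproduced exactly:

import numpy as np

# Hypothetical cell means: rows = cartoon type, columns = prior state.
cell_means = np.array([
    [4.30, 4.00],   # violent cartoon:    frustrated, calm
    [2.70, 2.72],   # nonviolent cartoon: frustrated, calm
])

# Marginal means combine the cell means across the levels of the other factor.
cartoon_marginals = cell_means.mean(axis=1)   # one mean per cartoon type
prior_marginals   = cell_means.mean(axis=0)   # one mean per prior state

print("Cartoon type marginal means:", cartoon_marginals)
print("Prior state marginal means:",  prior_marginals)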
Term
| Main effect |
|
Definition
Differences on the dependent measure across the levels of any one factor, controlling for all other factors in the experiment.
I.E. If 4.15 & 2.71 are significantly different, then - Main Effect of Cartoon Type.
If 2.97 & 3.90 are significantly different, then - Main Effect of Prior State. |
|
|
Term
| Interaction |
|
Definition
| When the influence of one independent variable on the dependent variable is different at different levels of another independent variable or variables. |
|
|
Term
| "Simple effect" of the first factor |
|
Definition
The effect of one factor within a level of another factor
I.E. If 2.68 & 3.25 are significantly different, then - Simple effect of cartoon type within the frustrated condition. |
|
|
Term
| ANOVA Summary Table - Each main effect and each interaction has its own: |
|
Definition
1) F test 2) Degrees of freedom 3) p-value |
|
|
Term
| Understanding Interactions |
|
Definition
| Visualize the relationships among variables using a line chart... |
|
|
Term
| Visualize the relationships among variables using a line chart: |
|
Definition
1) Levels of one factor are indicated on the horizontal axis. 2) The dependent variable is represented and labeled on the vertical axis. 3) Points represent the observed mean on the dependent variable in each of the experimental conditions. 4) Lines connect the points indicating each level of the second independent variable (see the sketch below). |
|
|
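A minimal matplotlib sketch of such a line chart; the condition means are invented for illustration:

import matplotlib.pyplot as plt

# Hypothetical condition means for a 2 x 2 design; DV = aggressive play.
prior_states     = ["Not frustrated", "Frustrated"]   # levels of one factor
violent_means    = [3.4, 4.9]                         # one line per level of the second IV
nonviolent_means = [2.7, 2.8]

plt.plot(prior_states, violent_means, marker="o", label="Violent cartoon")
plt.plot(prior_states, nonviolent_means, marker="o", label="Nonviolent cartoon")
plt.xlabel("Prior state")                 # levels of one factor on the horizontal axis
plt.ylabel("Aggressive play (mean)")      # dependent variable on the vertical axis
plt.legend(title="Cartoon type")
plt.title("Nonparallel lines suggest an interaction")
plt.show()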
Term
| Patterns with Main Effects Only |
|
Definition
| A main effect of the cartoon variable, but no interaction (the lines are parallel). Check online graphic. |
|
|
Term
| Patterns with Main Effects and Interactions |
|
Definition
| Lines are not parallel = interaction. |
|
|
Term
| Patterns with an Interaction Only |
|
Definition
| No significant main effects, only an interaction. |
|
|
Term
| Interpretation of Main Effects When Interactions are Present |
|
Definition
| When there is a statistically significant interaction between the two factors, the main effects of each factor must be interpreted with caution. |
|
|
Term
| The Three-Way Design (Three-way ANOVA) |
|
Definition
1) Involves three independent variables. 2) Each factor has two or more levels. 3) The ANOVA summary table includes... |
|
|
Term
| The ANOVA summary table includes: |
|
Definition
3 main effects, 3 two-way interactions, 1 three-way interaction.
Gender as a third IV makes it a quasi-experimental design (gender is measured, not manipulated). |
|
|
Term
| The Three-Way Interaction |
|
Definition
When a three-way interaction is found, the two-way interactions and the main effects must be interpreted with caution.
Having identified that the three IVs together influence the DV, we need to be careful if we remove one of the variables. |
|
|
Term
| "Factorial Designs Using Repeated Measures" - Any or all of the factors in factorial research designs may involve repeated measures. |
|
Definition
2 x 2 between: N = 120; 2 x 2 within: N = 30; 2 x 2 mixed: N = 60 |
|
|
Term
| Mixed factorial design |
|
Definition
| Some factors are between participants and some are repeated measures (within subjects design). |
|
|
Term
| Repeated-Measures Factorial Design |
|
Definition
1) Participants go through every measure (all IVs and DVs). 2) 2 IVs 3) Both IVs are “Within Subjects” (i.e. all participants are exposed to all levels of both IVs). |
|
|
Term
| 2 IVs (Repeated-Measures Factorial Design) |
|
Definition
1) Social Context (Alone vs. w/ Group). 2) Task Difficulty (Easy vs. Difficult). |
|
|
Term
| 2 IVs (Mixed Factorial Design) |
|
Definition
Same 2 IVs: 1) Task Difficulty is a “Within Subjects” variable. 2) Social Context is a “Between Subjects” variable. |
|
|
Term
| Comparison of the Condition Means in Experimental Designs |
|
Definition
1) Means comparisons 2) Pairwise comparison 3) Experimentwise alpha 4) Planned comparisons or a priori comparisons 5) Post hoc comparisons 6) Complex comparisons |
|
|
Term
| Means comparisons |
|
Definition
Conducted because a significant F value does not indicate which groups are significantly different from each other.
Determine which group means are significantly different from each other. |
|
|
Term
| Pairwise comparison |
|
Definition
Any one condition mean is compared with any other condition mean.
It may not be appropriate to conduct a statistical test on each pair of condition means. |
|
|
Term
| Experimentwise alpha |
|
Definition
The probability of having made a Type 1 error in at least one of the comparisons.
As the number of comparisons increases, the experimentwise alpha also increases.
Experimentwise alpha = alpha x number of pairwise comparisons (a simple upper bound; for independent comparisons the exact probability is 1 - (1 - alpha)^c), e.g.:
αe = .05 x 6 = .30 (2 x 2 design); αe = .05 x 15 = .75 (2 x 3 design); αe = .05 x 36 = 1.80 (3 x 3 design). See the sketch below. |
|
|
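A small sketch checking the experimentwise alpha figures above. The "alpha x comparisons" rule is the simple upper bound from the card; for independent comparisons the exact probability of at least one Type 1 error is 1 - (1 - alpha)^c, which is why the approximation can exceed 1:

from math import comb

alpha = 0.05
for label, cells in [("2 x 2", 4), ("2 x 3", 6), ("3 x 3", 9)]:
    c = comb(cells, 2)                 # number of pairwise comparisons among cell means
    approx = alpha * c                 # the simple approximation from the card above
    exact = 1 - (1 - alpha) ** c       # exact value assuming independent comparisons
    print(f"{label} design: {c} comparisons, approx = {approx:.2f}, exact = {exact:.2f}")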
Term
| Planned comparisons or a priori comparisons |
|
Definition
1) Only specific differences that were predicted by the research hypothesis are tested. 2) Reduces experimentwise alpha. |
|
|
Term
| Post hoc comparisons |
|
Definition
1) Take into consideration that many comparisons are being made. 2) Performed only if the F test is significant. 3) Reduces experimentwise alpha (see the sketch below). |
|
|
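A minimal sketch of one common post hoc procedure, Tukey's HSD, via statsmodels; the scores and group labels are invented for illustration:

import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical DV scores for three conditions of a one-way design.
scores = np.array([4, 5, 6, 5, 7, 6, 7, 8, 7, 9, 5, 5, 6, 6, 7], dtype=float)
groups = ["low"] * 5 + ["medium"] * 5 + ["high"] * 5

# Tukey's HSD compares every pair of condition means while holding
# the experimentwise alpha at the stated level.
result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
print(result)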
Term
| Complex comparisons |
|
Definition
1) More than two means are compared at the same time. 2) Usually conducted with contrast tests. 3) Reduces experimentwise alpha. |
|
|
Term
|
Definition
| Always keep your Hypothesis in mind |
|
|
Term
| Always keep your Hypothesis in mind |
|
Definition
1) What is your DV? 2) What is your first IV? 3) What is your second IV? 4) What is the relationship between your IVs? (Do you expect an interaction? If so, which cells of your design should you compare (simple effects)?) |
|
|