Term
Define 'independent samples/between-groups design' |
|
Definition
A design in which each level of the IV contains a different group of participants; each participant serves in only one condition. |
|
Term
Define 'related samples/within-groups design' |
|
Definition
A design in which the same participants serve in every level of the IV; each participant is tested under all conditions. |
|
Term
Define 'level' |
Definition
subgrouping of the independent variable(s) |
|
|
Term
Define 'ANOVA (analysis of variance)' |
Definition
tests for differences between the means of more than two groups |
|
|
Term
Define 'between group variance' |
|
Definition
Variance due to the IV (the differences among the group means produced by the treatment) |
|
Term
Define 'within groups variance' |
|
Definition
Variance not due to the IV (error variance among participants within each group) |
|
|
Term
Define 'post-hoc analysis' |
|
Definition
Tests that are done to determine the true source of difference between 3 or more groups. The post-hoc analysis tests for all pair-wise combinations. |
|
|
Term
Define 'post hoc contrasts' |
Definition
used to determine if each of the inequalities is statistically significant |
|
|
Term
Define 'factorial ANOVA' |
Definition
Analysis of variance with two or more IV's or factors |
|
|
Term
Define 'main effect' |
Definition
When a factor or IV has a significant effect upon the outcome variable, considered by itself. A main effect is qualified by the presence of a significant interaction |
|
|
Term
Define 'marginal means' |
Definition
means for the levels of one factor, averaging over (ignoring) all other factors |
|
|
Term
Define 'interaction' |
Definition
break from parallelism, an outcome in which the effect of one factor is different, depending on the level of another factor |
|
|
Term
Define 'MANOVA' |
Definition
Analysis of variance with one or more IV's and two or more DV's. It assesses differences in the set of DV's across the levels of the IV(s) |
|
|
Term
Compare and contrast designs where you would conduct a t-test, one-way ANOVA, and factorial ANOVA. |
|
Definition
T-test = 1 IV w/ 2 Levels & 1 DV
One-Way ANOVA = 1 IV w/ 2+ Levels & 1 DV
Factorial ANOVA = 2+ IV's & 1 DV |
|
|
Term
Describe an experiment where each (t-test, one way ANOVA, factorial ANOVA) would be used. |
|
Definition
t-test: You're measuring the effect of political party affiliation on charity involvement. (1 IV, 2 levels)
One-way ANOVA: You're testing the effects of a new drug against an established drug and a control group. (1 IV, 3 levels)
Factorial ANOVA: Does gender or grade level affect how often kids play on the slide at recess? (2 IV's, 1 DV) |
|
|
Term
Describe the logic behind the calculation of the F-ratio. If parts of the ANOVA printout were missing, be able to fill in blanks using info provided. |
|
Definition
The F-ratio compares between-groups variance (the effect of the IV plus error) to within-groups variance (error alone): F = MS-between / MS-within. If the IV has no effect, the two variance estimates are roughly equal and F is near 1; a significantly large F indicates the group means differ by more than chance alone would produce. Missing printout values can be filled in because MS = SS / df and F = MS-between / MS-within. |
|
|
Term
Discuss when and why post hoc contrasts are used in ANOVA. |
|
Definition
Post hoc tests are used when you are comparing differences between 3 or more levels of an IV. A significant F says only that some difference exists; the post hoc contrasts test each pair-wise difference numerically to locate which groups actually differ. |
|
|
Term
Given an SPSS printout for a between-groups t-test, determine if the t-value is significant. From this analysis, draw the appropriate research inference. |
|
Definition
Page 5, column 't'. The t-value is significant at the .000 level. "Males have significantly higher self-esteem than females." |
|
|
Term
Given an SPSS printout for an ANOVA, determine if the F-value is significant. From this analysis, draw the appropriate research inference. |
|
Definition
Page 6, middle table, column 'F'. The F-value is significant at the .05 level. There is a significant difference somewhere among the levels of the IV in this study. |
|
|
Term
Given an SPSS printout for post hoc contrasts, interpret the post hoc contrasts for the significant F and draw the appropriate research inference. |
|
Definition
Page 6, bottom chart, Column 'Sig.' "Participants in Treatment B display significantly lower levels of self-esteem than those in the control group." |
|
|
Term
In a 2 x 3 x 2 design, how many factors are there? How many levels in each factor? How many cells are involved? In a 3 x 3 design? |
|
Definition
In a 2 x 3 x 2 design, there are three factors. The first factor has two levels, the second has three, and the third has two. This study would have 12 cells. A 3 x 3 design would have two factors with three levels each, and 9 cells. |
|
|
Term
Discuss what it means for the main effects to be qualified by the presence of a significant interaction. Give an illustration. |
|
Definition
A main effect is the isolated impact of one factor on the dependent variable. Interactions occur when the change in the DV depends jointly on two or more IV's. One can find main effects for each factor, but in the presence of a significant interaction, those main effects are qualified. An interaction trumps a main effect because it changes how you understand the main effects. In the insult example, there were main effects for both "insult" and "region," but there was also an interaction, which trumped the main effects. |
|
|
Term
Given an SPSS printout for a factorial ANOVA, identify the cell means, marginal means, main effect significance tests, and interaction significance tests. Be able to interpret these analyses in light of a set of experimental hypotheses. |
|
Definition
Cell means = 2nd chart, "mean square" (?)
Marginal means = 3rd chart, "Mean"
ME sig tests = 2nd chart, 3rd & 4th rows under "Sig"
Interaction sig test = 5th row under "Sig" |
|
|
Term
Describe the research situation when a MANOVA would be used (in contrast to an ANOVA). |
|
Definition
It would be used when there are multiple DV's (e.g., you want to see what factors cause students to drop out of school) |
|
|
Term
Describe how MANOVA is considered by some to control for Type 1 error (Why not just run multiple ANOVA's?) |
|
Definition
Because each additional ANOVA you run inflates the overall (familywise) risk of a Type 1 error. The MANOVA looks for differences across all DV means simultaneously, as a set instead of individually. This keeps you from running multiple ANOVA's and, therefore, multiplying your risk of error. |
|
|
Term
Describe the interpretation of the Lambda statistic. |
|
Definition
Wilks' Lambda is the proportion of variance among the DVs left unexplained by the IV, so smaller values mean the IV explains more. Lambda tests for differences on all DV means at the same time--that is, it compares the groups on the "aggregate" of the DVs. |
|
|
Term
Describe how a significant lambda is to be followed up/explored. |
|
Definition
If the lambda is significant, then we can run one-way ANOVAs to see which DVs the groups differed on. If you have three or more groups in the IV, then each significant ANOVA should be followed up with post-hoc contrasts. |
|
|
Term
Compare and contrast MANOVA and Discriminant Function Analysis. |
|
Definition
Nominal IV and numerical DVs -- MANOVA
Numerical IVs and nominal DV -- DFA
A DFA is kind of a "reversed" MANOVA. MANOVA tests mean differences across multiple numerical variables, while the DFA uses scores to classify cases into nominal groups. |
|
|
Term
Given a MANOVA SPSS printout, locate and interpret the Lambda statistic. |
|
Definition
Page 9, bottom chart. Wilks' Lambda is in the left column, second row under the 'intercept'. Lambda is significant, so follow up with univariate ANOVAs. |
|
|
Term
Describe the purpose of factor analysis |
|
Definition
Factor analysis is used for data reduction and to identify redundancies in the data |
|
|
Term
Describe how factor analysis aids in exploring the dimensionality of a scale/test |
|
Definition
* It helps determine if what we are measuring is unidimensional or multidimensional
* Unidimensional scales assess one thing: all items tap one factor
* Multidimensional scales assess many things: items load on one of many factors |
|
|
Term
Define 'factor' |
Definition
underlying dimension identified in a factor analysis |
|
|
Term
Define 'eigenvalue' |
Definition
Value used to show the amount of variance the factor is accounting for |
|
|
Term
Define 'scree curve/plot' |
|
Definition
A plot of the factors against their respective eigenvalues |
|
|
Term
Define 'communality' |
Definition
Proportion of variance an item shares with all extracted factors |
|
|
Term
Define 'factor loading' |
Definition
correlation between an item and a factor |
|
|
Term
Define 'marker variables' |
|
Definition
The item most highly correlated with (loading highest on) a given factor |
|
|
Term
Define 'simple structure' |
|
Definition
Occurs when items load strongly on one factor and are relatively uncorrelated with other factors |
|
|
Term
Define 'split factor loadings' |
|
Definition
Occurs when a variable is equally correlated with 2 or more factors |
|
|
Term
Given an SPSS factor analysis printout, how many factors were retained, what were their eigenvalues, and how much variance did each account for? By retaining these factors, how much variance was left unexplained? |
|
Definition
Chart labeled "Total Variance Explained"--
List the components retained (those above the elbow of the scree plot)
Give their eigenvalues ("Total" column)
Give the % of variance for each ("% of Variance" column)
Then locate the cumulative % of variance for those components and subtract from 100 to find the % that is unaccounted for |
|
|
Term
Given SPSS printout for factor analysis, examine the ROTATED factor loadings, and identify the marker variables for each factor. |
|
Definition
For each factor, the marker variable is the item with the highest rotated loading on that factor. |
|
|
Term
Given an SPSS factor analysis printout, examine the factor loadings for each factor, determine what each factor is measuring, and label it. |
|
Definition
The factor is measuring 'making-fun-of-self' (label: self-deprecating humor) |
|
|
Term
Given an SPSS factor analysis printout, identify an item that has a split-factor loading |
|
Definition
Item 'faces 10' loads .452 on component 1 and .432 on component 2 |
|
|
Term
Define 'descriptive statistics' |
Definition
organizes, summarizes, and describes the characteristics of a data set by compressing it into summary values
examples: mean, SD, frequency |
|
|
Term
Define 'inferential statistics' |
Definition
- makes inferences from a smaller group of data (sample) to a larger one (population)
- draws inferences about the data set by testing the null hypothesis
- quantifies the relationship of x (IV) & y (DV)
examples: correlation, t-test, ANOVA
|
|
|
Term
Define 'mean' |
Definition
the arithmetic average of a data set |
|
|
Term
Define 'standard deviation' |
Definition
the average deviation from the mean--that is, the average amount of variability in a set of scores |
|
|
Term
What are the features of Z-Score Distribution? |
|
Definition
- approximately 68% of the scores fall within +/- 1 SD of the mean
- approximately 28% fall between 1 and 2 SD from the mean
- approximately 4% makes up the tail ends beyond +/- 2 SD
Bell Curve:
-2 = two SD units BELOW the mean
-1 = one SD unit BELOW the mean
0 = the mean
+1 = one SD unit ABOVE the mean
+2 = two SD units ABOVE the mean |
|
|
Term
Describe the Metric Problem & how Z-Scores help overcome it |
|
Definition
- the Metric Problem occurs when trying to compare or combine scores measured in different units (different metrics)
- the Z-score standardizes the scores, which allows different metrics to be combined so a composite can be created
|
|
|
Term
Given a Mean & SD, describe the nature of the distribution (where 68% or 96% of the distribution is located) |
|
Definition
Example:
Mean = 10; SD = 2
Range for 68% = 8 to 12
(simply add/subtract SD from Mean)
Range for 96% = 6 to 14
(simply double SD and add/subtract from Mean)
|
|
|
Term
Given a Z-Score, describe the relationship to the rest of the distribution...
(how far above or below are they from mean in SD units?) |
|
Definition
Example:
Subject #1's Z-Score = 1.25
This Z-Score indicates the subject falls 1.25 SD units ABOVE the mean |
|
|
Term
Define 'alpha level (level of significance)' |
Definition
- the degree of risk you are willing to take that the null hypothesis will be rejected when it is actually true
- If there is Statistical Significance, the result is unlikely to be due to chance
|
|
|
Term
Define 'Type 1 error' |
Definition
probability of rejecting the null hypothesis when it is true
(means the results ARE DUE to chance)
Alpha & False Positive
|
|
|
Term
Define 'Type 2 error' |
Definition
probability of failing to reject the null hypothesis when it is false
(means the results are NOT DUE to chance)
Beta & False Negative |
|
|
Term
Define 'statistical power' |
Definition
the ability to reject the null hypothesis when it should be rejected
(the larger the sample, the stronger the power)
1-Beta |
|
|
Term
Describe when a statistic is Statistically Significant & what it means |
|
Definition
A statistic is said to be Statistically Significant when the p-value is found to be less than the specified level of significance
(usually p < 0.05)
What this means is that the finding was unlikely to be due to chance alone, therefore ruling out the null hypothesis as a plausible explanation |
|
|
Term
What is one way to increase the power of a statistical test? |
|
Definition
increase Sample Size
or
increase Effect Size |
|
|
Term
Describe features of the Correlation Coefficient |
|
Definition
- Only used with numerical data to reflect the relationship between 2 variables
- It assesses the trend (direction) & strength of the linear relationship between 2 variables
|
|
|
Term
Describe the nature of the relationship between two variables and state if the relationship is statistically significant
(using SPSS printout) |
|
Definition
Start by checking the Pearson Correlation row of the variable in the Correlations table.
Example:
Comparing Depression to Hopelessness & Self-Esteem, the Correlations table indicates the Pearson Correlation for Hopelessness is .848 with significance of .000 & for Self-Esteem is -.492 with significance of .060.
Interpretation of Example:
*The correlation between Depression & Hopelessness is Positive with a Strong strength at .848 and it Is Significant at the .000 level.
*The correlation between Depression & Self-Esteem is Negative with a Moderate strength at -.492 and it Is Not Significant.
|
|
|
Term
Calculate the Proportion & Percent of Shared Variance between variables
(given the correlation coefficient) |
|
Definition
In the Correlations table, use the Pearson Correlation row of the variable to identify the correlation value, then Square that number to find the Proportion/Percent of Shared Variance
Example:
Comparing Depression to Hopelessness & Self-Esteem the Correlations table indicates the Pearson Correlation for Hopelessness is .848
Interpretation of Example:
The Proportion/Percent of Shared Variance b/w Depression & Hopelessness = .72 or 72%
*note: R2 in your Model Summary table will also give you this information, without having to square it yourself. |
|
|
Term
Describe the Partial Correlation between variables when controlling for a covariate in words...
(given SPSS printout) |
|
Definition
Using the table, start by identifying what variable is being controlled for. Next, locate the Correlation & Significance values.
Example:
Covariate (control) Variable = Self-Esteem; Other Variables = Depression & Hopelessness. In the table, Depression & Hopelessness indicate a correlation of .637 & significance of .003
Interpretation of Example:
*The correlation between Depression & Hopelessness is .637 when controlling for Self-Esteem, and is significant at the .003 level
|
|
|
Term
Describe the ideas behind Partial Correlation Coefficient
(what does it mean when something is being controlled for?) |
|
Definition
It examines the relationship between 2 variables by controlling shared variance with a 3rd variable; it is removing the influence of the controlled variable in the correlation. |
|
|
Term
What is the Regression Equation used for? |
|
Definition
It is used to look at the relation between 2 variables & it allows the prediction of an unknown variable based on the value of a known variable |
|
|
Term
Difference between Simple Regression & Multiple Regression |
|
Definition
Simple Regression:
one predictor predicting one criterion
Multiple Regression:
two or more predictors predicting one criterion |
|
|
Term
Write & Identify each part of the
Multiple Regression Equation |
|
Definition
y' = B1x + B2z + B3w + B0
B1 , B2 , B3 = slopes (raw-score weights) for each predictor
(coefficients table > variable name rows > B column)
B0 = y-intercept
(coefficients table, Constant row, B column)
y' = criterion
X, Z, W = predictors
(x,z,w will be given) |
|
|
Term
Where to find Correlation between Observed & Predicted scores in Simple/Multiple Regression data?? |
|
Definition
Model Summary table > R column |
|
|
Term
Where to find Proportion/Percent of Variance predictors were accounting for in Y in Simple/Multiple Regression data?? |
|
Definition
Model Summary table > R Square column |
|
|
Term
Where to find if the overall prediction was Significant for Simple/Multiple Regression data? |
|
Definition
ANOVA table > Sig column |
|
Term
Where to find Y-Intercept in Simple/Multiple Regression data? |
|
Definition
Coefficients table > Constant row > B column |
|
|
Term
Where to find Raw-Score Beta Weights for each predictor in Simple/Multiple Regression data?
Interpret into words... |
|
Definition
Coefficients table > Model column > locate variable name > find value of variable under B column
If variable A goes up a point, variable B is predicted to go up [insert value from B column here] points. |
|
|
Term
Where to find the Z-Score beta weights for each predictor in Simple/Multiple Regression data?
Interpret the data in words... |
|
Definition
Coefficients table > Model Column > locate variable names > locate value of variable in Beta column
If variable A goes up a SD, variable B is predicted to go up [insert value from Beta column here] SDs. |
|
|
Term
Where to identify which predictors make a significant contribution to the prediction and which do not in Simple/Multiple Regression data |
|
Definition
Coefficients table > Sig column |
|
|
Term
Describe how Beta Weights are similar to Partial Correlation Coefficients |
|
Definition
Standardized beta weights can be interpreted roughly like partial correlations because both of them, in a sense, control for the presence of a covariate. |
|
|