Term
What happens to S y.x (st. error of the regression line) and variance of Y-hat when: r gets stronger |
|
Definition
- S y.x: decreases (good correlation -> less error) - variance of Y-hat: increases (can differentiate better with a higher correlation: who is going to be liberal vs. who is not going to be liberal) |
|
|
Term
What happens to S y.x (st. error of the regression line) and variance of Y-hat when: r gets weaker |
|
Definition
- S y.x: increases (weaker correlation -> more error) - variance of Y-hat: decreases (can't differentiate as well with a weaker correlation) |
|
|
Term
What happens to S y.x (st. error of the regression line) and variance of Y-hat when: r = -1 or +1 |
|
Definition
- S y.x = 0 - variance of Y-hat = variance in the actual Y-scores |
|
|
Term
What happens to S y.x (st. error of the regression line) and variance of Y-hat when: r = 0 |
|
Definition
- S y.x = S y - variance of Y-hat = 0 (everyone gets the same predicted score: the mean of Y) |
|
|
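The four cards above follow from two textbook identities: S y.x = S_y * sqrt(1 - r^2) and var(Y-hat) = r^2 * var(Y). A minimal sketch in Python (function names are mine):

```python
import math

def std_error_of_estimate(s_y, r):
    """S_y.x = S_y * sqrt(1 - r^2): shrinks toward 0 as |r| -> 1,
    and equals S_y when r = 0."""
    return s_y * math.sqrt(1 - r ** 2)

def variance_of_predicted(s_y, r):
    """var(Y-hat) = r^2 * S_y^2: grows toward the variance of the
    actual Y-scores as |r| -> 1, and is 0 when r = 0."""
    return (r ** 2) * (s_y ** 2)

s_y = 10.0
for r in (0.0, 0.4, 0.8, 1.0):
    print(r, std_error_of_estimate(s_y, r), variance_of_predicted(s_y, r))
```

As r strengthens, error (S y.x) falls and the spread of predictions rises, exactly as the cards state.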
Term
What is regression to the mean? |
Definition
a statistical phenomenon that occurs whenever |r| < 1: extreme predictor scores tend to come out more moderate (closer to the mean) the next time around |
|
|
Term
When is regression to the mean the most severe? |
|
Definition
when you have a very weak correlation |
|
|
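A quick way to see why a weak correlation makes regression to the mean most severe: in standardized form the regression equation is z_yhat = r * z_x, so the weaker r is, the further an extreme score is pulled back toward the mean. A sketch (names are mine):

```python
def predicted_z(r, z_x):
    """In standardized form the regression equation is z_yhat = r * z_x,
    so predictions are always pulled toward the mean when |r| < 1."""
    return r * z_x

z_extreme = 2.0                     # someone two SDs above the mean
print(predicted_z(0.9, z_extreme))  # 1.8: mild regression with a strong r
print(predicted_z(0.2, z_extreme))  # 0.4: severe regression with a weak r
```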
Term
What impact can regression to the mean have in the interpretation of experimental results? |
|
Definition
when studying an extreme group, regression to the mean would predict a more moderate score on the retest - thus, you might claim a treatment (tx) effect when regression to the mean was the actual cause of the change in the scores |
|
|
Term
What are the 3 phases in regression research? |
|
Definition
1. Derivation phase 2. Application phase 3. Cross-validation phase |
|
|
Term
Describe the first phase in regression research: the Derivation phase |
|
Definition
collect predictor info, wait for criterion info, derive regression formula (and r or R and S y.x) |
|
|
Term
Describe the second phase in Regression research: Application phase |
|
Definition
- intent: to apply the derived regression equation to new cases - collect predictor info on a new sample, apply the old regression weights, predict the outcome for this new sample |
|
|
Term
Describe the third phase in regression research: Cross-validation |
|
Definition
- see how well the predictor works on the new sample - shrinkage always happens: the equation never fits the 2nd group quite as well |
|
|
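The three phases can be sketched end to end with simple least squares on made-up data (a sketch, not the course's worked example; helper names are mine):

```python
import math
import random
import statistics

def fit_line(x, y):
    """Derivation phase: least-squares slope and intercept."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def pearson_r(x, y):
    """Pearson correlation between two score lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

random.seed(1)
# 1. Derivation: collect predictor and criterion info, derive the formula
x1 = [random.gauss(0, 1) for _ in range(200)]
y1 = [2 * xi + random.gauss(0, 1) for xi in x1]   # criterion = 2x + noise
a, b = fit_line(x1, y1)

# 2. Application: apply the OLD weights to a brand-new sample
x2 = [random.gauss(0, 1) for _ in range(200)]
y2_hat = [a + b * xi for xi in x2]

# 3. Cross-validation: how well do those predictions track the new criterion?
y2 = [2 * xi + random.gauss(0, 1) for xi in x2]
print(b, pearson_r(y2_hat, y2))   # some shrinkage relative to phase 1 is typical
```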
Term
What is the goal of multiple regression |
|
Definition
identify optimal combination of predictors to maximize the accuracy of prediction |
|
|
Term
Advantages of using multiple regression (3) |
|
Definition
1. more realistic 2. each predictor counteracts the weaknesses in the others 3. greater accuracy (reduction in error) |
|
|
Term
What are 2 considerations when selecting a second predictor in multiple regression? |
|
Definition
1. want low correlation with the other predictor: don't want IV's to overlap 2. want high correlation with DV * want IV's to uniquely capture as much variance in DV as possible |
|
|
Term
b vs. beta weights in multiple regression |
Definition
- b weights: used with raw scores; the practical choice when making predictions for individuals - beta weights: standardized weights, directly comparable, so you can compare the relative strength of predictors |
|
|
Term
R: the multiple correlation |
Definition
- correlation based on more than one predictor - def: the correlation between the predicted outcome and the actual outcome - a reflection of how well we did: how well our predictors match reality |
|
|
Term
R squared: the coefficient of multiple determination |
Definition
- proportion of variance in criterion that can be accounted for by a combination of Predictor 1 and Predictor 2 - how well can we predict some outcome by a combination of predictors |
|
|
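R and R squared from the two cards above can be computed directly: fit a two-predictor equation, then correlate the predicted outcome with the actual outcome. A sketch with made-up data and my own helper names (real work would use a stats package):

```python
import math
import random
import statistics

def multiple_regression(x1, x2, y):
    """Two-predictor least squares via the centered normal equations."""
    m1, m2, my = map(statistics.mean, (x1, x2, y))
    c1 = [v - m1 for v in x1]
    c2 = [v - m2 for v in x2]
    cy = [v - my for v in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 ** 2
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    return my - b1 * m1 - b2 * m2, b1, b2

def pearson_r(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

random.seed(2)
x1 = [random.gauss(0, 1) for _ in range(300)]
x2 = [random.gauss(0, 1) for _ in range(300)]   # low correlation with x1
y = [3 * u + 1 * v + random.gauss(0, 1) for u, v in zip(x1, x2)]

a, b1, b2 = multiple_regression(x1, x2, y)
y_hat = [a + b1 * u + b2 * v for u, v in zip(x1, x2)]
R = pearson_r(y_hat, y)          # multiple correlation: predicted vs. actual
print(b1, b2, R, R ** 2)         # R^2: proportion of variance accounted for
```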
Term
Key features of the sampling distrib for r |
|
Definition
- center: zero - variability: S_r - shape: normal if the null hyp is that rho is 0 (if the null hyp is that rho does not equal zero, the distrib gets highly skewed in the extremes b/c of the ceiling and floor effects) |
|
|
Term
What happens when the null hyp is that rho is not zero |
|
Definition
the distrib is no longer normal, so an adjustment is needed, the Fisher r -> z conversion makes that adjustment |
|
|
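The Fisher r -> z conversion is z = 0.5 * ln((1 + r) / (1 - r)), with standard error 1 / sqrt(n - 3). A sketch (function names are mine):

```python
import math

def fisher_z(r):
    """Fisher r -> z: z = 0.5 * ln((1 + r) / (1 - r)).
    Straightens out the skewed sampling distribution when rho != 0."""
    return 0.5 * math.log((1 + r) / (1 - r))

def se_z(n):
    """Standard error of Fisher z for a sample of size n."""
    return 1 / math.sqrt(n - 3)

print(fisher_z(0.0))   # 0.0: no adjustment needed at rho = 0
print(fisher_z(0.8))   # ~1.099: extreme r's get stretched away from the ceiling
```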
Term
What is meant by a global test? |
|
Definition
a significant F test tells you that there is some difference among those means (some difference somewhere) |
|
|
Term
Key features of post-hoc tests (3) |
|
Definition
1. only done when global F is significant 2. Pairwise comparisons 3. build in safeguards for alpha |
|
|
Term
Bonferroni: how does it adjust alpha, how conservative? |
|
Definition
- adjusts alpha based on the # of extra comparisons: new alpha is .05/# of extra tests - highly conservative - many applications: t tests, correlations |
|
|
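The Bonferroni adjustment on the card above is just a division; a one-line sketch (function name is mine):

```python
def bonferroni_alpha(familywise_alpha, n_tests):
    """Per-test significance threshold after a Bonferroni correction:
    divide the familywise alpha by the number of comparisons."""
    return familywise_alpha / n_tests

# e.g., 5 pairwise t-tests while holding familywise alpha at .05
print(bonferroni_alpha(0.05, 5))
```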
Term
how does the Tukey HSD test safeguard for alpha? do you need equal n? how conservative |
|
Definition
- raises the threshold for sig by the "q" multiplier - need approx equal n for hand calculations - less conservative |
|
|
Term
Scheffé: how does it safeguard alpha? how conservative? |
|
Definition
- very similar to ANOVA F calculations - multiple steps in the conservative dir. (df, error term, critical F) - intermediate level of conservatism |
|
|
Term
significance vs measures of magnitude |
|
Definition
- significance only tells us if it was sig or not, black and white - measures of magnitude tell us the size of the tx effect (Cohen's d: in s.d. units) - r squared and eta squared are percents: proportion of variance accounted for |
|
|
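Cohen's d from the card above is the mean difference expressed in pooled-standard-deviation units; a sketch with made-up scores (function name is mine):

```python
import statistics

def cohens_d(g1, g2):
    """Cohen's d: mean difference divided by the pooled standard deviation,
    so the effect size is reported in s.d. units, not raw-score units."""
    m1, m2 = statistics.mean(g1), statistics.mean(g2)
    pooled_var = (sum((x - m1) ** 2 for x in g1) +
                  sum((x - m2) ** 2 for x in g2)) / (len(g1) + len(g2) - 2)
    return (m1 - m2) / pooled_var ** 0.5

# made-up treatment vs. control scores
print(cohens_d([6, 7, 8, 9], [4, 5, 6, 7]))   # positive d: tx group scored higher
```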
Term
how do sampling distributions serve as the foundation for hyp testing? |
|
Definition
sampling distrib tells us what we would expect to happen by chance. without it, we wouldn't know how chance behaves, so we would not have a way to test whether our sample was significantly different from chance |
|
|
Term
Relationship between the two-group t-test and ANOVA |
Definition
- a 1 x 2 ANOVA is the same as doing a 2 group t-test - F = t squared |
|
|
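The F = t squared identity on the card above can be checked numerically by computing a pooled-variance t and a one-way ANOVA F for the same two groups (a sketch with made-up scores; names are mine):

```python
import statistics

def t_and_F(g1, g2):
    """Independent-samples t (pooled variance) and one-way ANOVA F
    for the same two groups, to show F = t^2."""
    n1, n2 = len(g1), len(g2)
    m1, m2 = statistics.mean(g1), statistics.mean(g2)
    # pooled variance = SS_within / df_within
    sp2 = (sum((x - m1) ** 2 for x in g1) +
           sum((x - m2) ** 2 for x in g2)) / (n1 + n2 - 2)
    t = (m1 - m2) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

    grand = statistics.mean(g1 + g2)
    ss_between = n1 * (m1 - grand) ** 2 + n2 * (m2 - grand) ** 2
    ms_between = ss_between / 1          # df_between = k - 1 = 1
    ms_within = sp2                      # same pooled error term
    F = ms_between / ms_within
    return t, F

t, F = t_and_F([4, 5, 6, 7], [6, 7, 8, 9])
print(t, F, t ** 2)   # F equals t squared
```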
Term
some differences between ANOVA and correlation/regression: focus,type of IV |
|
Definition
- focus: for ANOVA it is significance testing on means; for correlation/regression it is making predictions and determining relationships - IV: for ANOVA the IVs are categorical; for correlation/regression the IV is continuous |
|
|
Term
When is doing linear regression the best way to go |
|
Definition
when you have an IV and DV that are both on a continuum and you want to make a prediction |
|
|