Term
Factorial design |
|
Definition
effect of 2 or more independent variables (factors) on a dependent variable |
|
|
Term
Main effect |
|
Definition
effect of one independent variable on dependent variable (regardless of other IV) |
|
|
Term
Interaction |
|
Definition
effect of IV1 on DV depends on the level of IV2 and vice versa |
|
|
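Note: the three cards above (factorial design, main effect, interaction) can be made concrete with a small worked example. The sketch below is a minimal Python illustration using made-up numbers and the salt/sugar, small/large conditions from the later cards; the cell means and variable names are assumptions, not course data.

```python
# Hypothetical 2x2 factorial example: sauce type (salt/sugar) x amount (small/large).
# Cell values are made-up mean "liking" scores, used only to show how main
# effects and an interaction are read off a 2x2 table of cell means.
cell_means = {
    ("salt", "small"): 4.0, ("salt", "large"): 6.0,
    ("sugar", "small"): 5.0, ("sugar", "large"): 9.0,
}

def marginal(factor_level, position):
    """Average the cells that share one level of one factor (a marginal mean)."""
    vals = [m for (sauce, amount), m in cell_means.items()
            if (sauce, amount)[position] == factor_level]
    return sum(vals) / len(vals)

# Main effect of sauce: difference between the sauce marginal means,
# ignoring amount entirely.
main_effect_sauce = marginal("sugar", 0) - marginal("salt", 0)

# Main effect of amount: difference between the amount marginal means,
# ignoring sauce entirely.
main_effect_amount = marginal("large", 1) - marginal("small", 1)

# Interaction: the effect of amount differs depending on the level of sauce.
effect_of_amount_for_salt = cell_means[("salt", "large")] - cell_means[("salt", "small")]
effect_of_amount_for_sugar = cell_means[("sugar", "large")] - cell_means[("sugar", "small")]
interaction = effect_of_amount_for_sugar - effect_of_amount_for_salt

print(main_effect_sauce, main_effect_amount, interaction)  # 2.0 3.0 2.0
```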
Term
Factorial Totally between design |
|
Definition
both factors are between participants
Group 1 = salt, small; Group 2 = salt, large
Group 3 = sugar, small; Group 4 = sugar, large |
|
|
Term
Factorial totally within design |
|
Definition
both factors are within participants
a single group experiences all four conditions: salt small, salt large, sugar small, and sugar large |
|
|
Term
Factorial mixed design (Split plot) |
|
Definition
one factor is within and the other is between
Ex: if sauce type is between, each group gets either salt or sugar; if it is within, each group gets both
If amount is between, each group gets either large or small; if it is within, each group gets both |
|
|
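Note: as a rough sketch of how the three factorial layouts above differ in practice, the Python snippet below assigns hypothetical participants to conditions in a totally between, totally within, and mixed (split-plot) design. The participant IDs, condition labels, and the use of simple random assignment are illustrative assumptions.

```python
import itertools
import random

SAUCES = ["salt", "sugar"]
AMOUNTS = ["small", "large"]
CONDITIONS = list(itertools.product(SAUCES, AMOUNTS))  # the 4 cells of the 2x2

participants = [f"P{i}" for i in range(1, 9)]  # hypothetical participant IDs

# Totally between: each participant lands in exactly one of the four cells
# (in practice assignment would also be balanced across cells).
between = {p: random.choice(CONDITIONS) for p in participants}

# Totally within: every participant experiences all four cells,
# in a randomized order to spread out order effects.
within = {p: random.sample(CONDITIONS, k=len(CONDITIONS)) for p in participants}

# Mixed (split plot): sauce is between (each participant gets only one sauce),
# amount is within (each participant gets both small and large of that sauce).
mixed = {}
for p in participants:
    sauce = random.choice(SAUCES)
    mixed[p] = [(sauce, amount) for amount in AMOUNTS]

print(between["P1"], within["P1"], mixed["P1"], sep="\n")
```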
Term
Types of Observational Studies |
|
Definition
Naturalistic or participant
Covert or overt; disguised or undisguised
|
|
|
Term
Naturalistic Study/Observation |
|
Definition
researcher just watches and does not interact
if overt: desensitization (slowly get participants accustomed to your presence by moving closer and closer)
or habituation (show up frequently in the vicinity and they will slowly stop reacting) |
|
|
Term
Participant Study/Observation |
|
Definition
researcher interacts with those being observed
either disguised or undisguised |
|
|
Term
Types of confounds in Observational study |
|
Definition
Reactivity = presence/knowledge of observer influences subject's behavior
Observation bias = biased observations on the part of the observer (fix by having multiple observers) |
|
|
Term
Recording observations |
|
Definition
Observation notes = take asap so there is less memory contamination
Recording = audio, video; typically broad and unstructured, so the data will need data reduction
Data reduction = recoding |
|
|
Term
Static v. Action information |
|
Definition
Static = info that stays stable throughout observation session
Ex: age, gender, setting
Action = looking for specific behaviors; each action has its own operational definition |
|
|
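Note: a minimal sketch of the static vs. action distinction from the card above, assuming a hypothetical playground observation session; the coding categories and operational definitions shown are placeholders, not an actual coding scheme.

```python
# Static information: recorded once and stays stable for the whole session.
static_info = {"setting": "playground", "child_age": 6, "child_gender": "F"}

# Action categories: each behavior of interest has its own operational definition
# (the definition strings here are hypothetical placeholders).
action_codes = {
    "sharing": "hands a toy to another child without being asked",
    "aggression": "hits, pushes, or grabs a toy from another child",
}

# Running tally of observed actions during the session.
tally = {code: 0 for code in action_codes}

def record(action):
    """Increment the count for an operationally defined action."""
    if action not in action_codes:
        raise ValueError(f"'{action}' has no operational definition")
    tally[action] += 1

record("sharing")
record("sharing")
record("aggression")
print(static_info, tally)
```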
Term
Instrumentation effect in Observational Study |
|
Definition
human observer's coding can change over time (early v. late observations) |
|
|
Term
Time sampling v. Event sampling |
|
Definition
Time sampling = sampling the times at which observations are made
systematically (every hour, etc.) or randomly (randomly chosen time points or intervals)
Event sampling = sample the events that include the behavior of interest
systematically or randomly |
|
|
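Note: a minimal Python sketch of systematic vs. random time sampling from the card above, assuming a hypothetical 8-hour observation window measured in minutes; event sampling is described only in a comment because it is driven by when the behavior actually occurs.

```python
import random

SESSION_MINUTES = 480  # hypothetical 8-hour observation window

# Systematic time sampling: observe at a fixed interval, e.g. every 60 minutes.
systematic_times = list(range(0, SESSION_MINUTES, 60))

# Random time sampling: draw the same number of observation points at random.
random_times = sorted(random.sample(range(SESSION_MINUTES), k=len(systematic_times)))

# Event sampling would instead be driven by occurrences of the behavior of
# interest (e.g. code every Nth event, or a random subset of events),
# rather than by the clock.

print("systematic:", systematic_times)
print("random:    ", random_times)
```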
Term
Situation sampling |
|
Definition
sample a variety of settings and circumstances where behavior of interest occurs
Ex: observe play behavior at multiple playgrounds
Ex: observe teaching at multiple schools |
|
|
Term
Field experiment |
|
Definition
Doing an experiment in a naturalistic setting
Difference from observational = researcher is actually manipulating a variable
Ex: instead of observing how primates share food, the researcher manipulates where the food is, etc. |
|
|
Term
Naturalness of setting (lowest to highest) |
|
Definition
Lab experiments < field experiments < observational studies |
|
|
Term
Survey use in Behavioral research |
|
Definition
use them before or after an experiment
for controlling for extraneous variables
checking the manipulation (manipulation check)
interpreting the data better
Most common: exit surveys and stimuli norming |
|
|
Term
Written Surveys Pros and Cons |
|
Definition
Pro = can reach large number of people easily, good for gaining sensitive info
Con: can't clarify questions and can't verify the identity of the respondents |
|
|
Term
Mail surveys pro and con |
|
Definition
pro = can reach large # of people and wide sets of potential respondents
Con = low response rate (could be raised a bit with incentives) |
|
|
Term
Group administered surveys
pro and con |
|
Definition
Pro = high response rate
Con = time constraint |
|
|
Term
Internet surveys pro and con |
|
Definition
Pro = cheap and fast
con = can suffer from low response rate (can raise by making it a paid survey) |
|
|
Term
Interviews pro and con |
|
Definition
Pro = better choice if ability to clarify questions is important; higher response rate
con = greater risk of interviewer bias |
|
|
Term
Telephone interviews pro and con |
|
Definition
Landline or cell phone?
low response rate |
|
|
Term
Personal interview pro and con |
|
Definition
pro = can verify identity of respondent; high response rate
con = can't assure anonymity so more likely to produce socially desirable answers; expensive and time consuming |
|
|
Term
Open ended questions v. closed questions |
|
Definition
open = rich information but hard to analyze, often need to do data reduction
closed = yes/no, multiple choice, rating scale, forced choice |
|
|
Term
what kind of questions to avoid |
|
Definition
leading, loaded, double-barreled, negatively worded |
|
|
Term
Leading question |
|
Definition
changes people's responses
a question whose wording includes information that leads people to a particular response |
|
|
Term
Loaded question |
|
Definition
question that includes terms that are emotionally laden and not neutral |
|
|
Term
Double-barreled question |
|
Definition
asking two questions in one |
|
|
Term
Response sets (non differentiation) |
|
Definition
adopting a consistent way of answering all questions, especially towards the end of a long survey
answering all items positively, negatively, or neutrally |
|
|
Term
Acquiescence |
|
Definition
"yea" saying
people say yes or or strongly agree to every item |
|
|
Term
Fence-sitting |
|
Definition
answering in the middle of the scale
how to avoid: take away neutral option |
|
|
Term
Socially desirable responding |
|
Definition
faking good (sometimes faking bad)
how to avoid/lessen? anonymous surveys |
|
|
Term
Implicit Association Test |
|
Definition
Evaluates people's opinions about sensitive topics
Measures the strength of associations between concepts (e.g., black people, gay people, old people) and evaluations (e.g., good, bad) or stereotypes (e.g., athletic, clumsy)
People's response speed is unconsciously influenced by the task-irrelevant classification categories |
|
|
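Note: the IAT card above attributes differences in response speed to how the classification categories are paired. The sketch below is a heavily simplified, hypothetical illustration of that idea (comparing mean latencies between two pairing blocks); it is not the actual IAT scoring procedure, which uses a standardized D-score algorithm.

```python
# Hypothetical response times (milliseconds) from two sorting blocks.
# In the real IAT, scoring is more involved (the D-score algorithm); this only
# illustrates the basic logic: slower responses in one pairing block suggest a
# weaker association between the concepts paired in that block.
compatible_block_ms = [512, 480, 530, 495, 505]    # pairing the respondent finds easy
incompatible_block_ms = [640, 610, 655, 700, 620]  # pairing the respondent finds hard

def mean(xs):
    return sum(xs) / len(xs)

latency_difference = mean(incompatible_block_ms) - mean(compatible_block_ms)
print(f"Mean latency difference: {latency_difference:.1f} ms")
```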