Term
Descriptive Research Strategy- Measuring variables as they exist naturally |
|
Definition
Not examining a relationship Useful for preliminary research Purely "descriptive" |
|
|
Term
Descriptive Research Strategy- Observational Research |
|
Definition
Naturalistic Observation Structured Observation Case Study Survey Archival Research |
|
|
Term
Quantitative Research |
Definition
Assigns numerical values to responses and measures -Focuses on specific behaviors that can be easily quantified -Use large samples -Subject the data to statistical analyses |
|
|
Term
Qualitative Research |
Definition
Data are non-numerical and expressed in language and/or images -Focuses on behavior in natural settings -Small groups and limited settings -Describe or capture themes that emerge from data |
|
|
Term
Observational Research Methods - Purpose |
|
Definition
To gather data on which further studies can be based -Disprove theories -Not experiments - designed to be descriptive, no attempt to manipulate environment, descriptive research cannot predict an event, systematic recording of naturally occurring events/behaviors We tend not to use naturalistic observation (thoughts cannot be observed) |
|
|
Term
Naturalistic (Field) Observation |
|
Definition
The observation of behavior in a natural setting as unobtrusively as possible Strengths- Behavior observed in real world, useful for non-manipulated behaviors, actual behaviors observed and recorded
Weaknesses- Time consuming, potential for observer influence, potential for subjective interpretation
Walking and bumping into each other on the sidewalk |
|
|
Term
Participant Observation |
Definition
Engages in the same activities as the people being observed in order to observe and record behavior "Being sane in insane places" Strengths- When natural observation is not possible, get info not accessible otherwise, participation gives unique perspective
Weaknesses- Time consuming, potential loss of objectivity, increased chance for observer influence |
|
|
Term
Structured (Contrived) Observation |
|
Definition
Settings arranged specifically to facilitate the occurrence of specific behaviors -Strange Situation Strengths- do not have to wait for behaviors to occur Weaknesses- Less natural |
|
|
Term
Case Study |
Definition
In-depth study and detailed description of a single individual (or small group) |
|
|
Term
Applications of the Case Study |
|
Definition
Rare phenomena and unusual clinical cases (Genie) Case studies as counterexamples Psychobiography- applies psychological theory to explain an individual |
|
|
Term
Case Study Design -Strengths |
|
Definition
Strengths- Not averaged over a diverse group, detailed description, vivid/powerful/convincing, compatible with clinical work, can study rare and unusual events, can identify exceptions to the rule |
|
|
Term
Case Study Design- Weaknesses |
|
Definition
Limited generalization, potential for selective bias, potential for subjective information |
|
|
Term
Survey |
Definition
Gathers detailed self-reported information from a large number of individuals |
|
|
Term
Indirect Observation of Behavior |
|
Definition
Archival Research- Measuring behaviors from historical records |
|
|
Term
Archival Research |
Definition
Observation based on existing info Strength- Can address questions that can be addressed in no other ways Weaknesses- Difficult to obtain, cannot be sure of accuracy of records |
|
|
Term
Running an Observational Study |
|
Definition
Decide on what exactly you are going to observe Decide a strategy or method to collect information Establish ways of measuring your reliability Decide how specific you want the info. to be |
|
|
Term
|
Definition
Continuous description of what the observer sees during the course of events -Costly and time consuming -Results in a huge amount of data -Least control for observer bias |
|
|
Term
Interval Method (Time Sampling) |
|
Definition
The entire observation period gets broken down into intervals of observation and intervals of recording -Used when you know a behavior will occur repeatedly -Reduces how much info. you lose due to fatigue -Minimizes errors due to forgetting -Recording is usually categorical, thus very simple |
|
|
Term
Frequency Method (Event Sampling) |
Definition
Measure the frequency of a behavior in a given time period |
|
|
Term
Duration Method (Event Sampling) |
|
Definition
Wait for the event to occur then start recording the length of the episode |
|
|
Term
|
Definition
Behaviors must not be influenced/disturbed Some degree of subjective interpretation Measure reliability |
|
|
Term
Observational Research Strengths |
|
Definition
Terrific source of hypotheses for experiments Naturalistic- recording actual behavior High external validity Flexible Great method for collecting data |
|
|
Term
Observational Research Weaknesses |
|
Definition
Ethical concerns Reactivity- people know they are being watched and may change their behaviors Humans are fallible observers- we miss a lot Description is open to bias |
|
|
Term
Possible Issues in Observation |
|
Definition
Observer bias Confirmation bias Causal bias Observer drift- observer's way of observing may shift over the course of investigation Sampling- we cannot observe everything, we may miss important behaviors Inter-rater reliability is a measure of observational quality |
|
|
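The inter-rater reliability mentioned above can be quantified. A minimal Python sketch, assuming each observer's codes are categorical labels (the labels here are hypothetical); Cohen's kappa corrects raw agreement for the agreement expected by chance:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical codes."""
    n = len(rater_a)
    # Raw proportion of observations on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal proportions
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in counts_a)
    return (observed - expected) / (1 - expected)

# Hypothetical codes from two observers watching the same behavior
a = ["hit", "miss", "hit", "hit"]
b = ["hit", "miss", "hit", "hit"]
kappa = cohens_kappa(a, b)  # identical codes give kappa = 1.0
```

Values near 1 indicate strong inter-rater agreement; values near 0 mean the observers agree no more often than chance would predict.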
Term
|
Definition
Construct- Aggression Operational Definitions- behavioral observation, parent or teacher ratings, peer nominations, test situations, visits to headmaster |
|
|
Term
Modalities of Measurement |
Definition
Self Report (easiest) Physiological (PET scan, MRI)-Expensive/ unnatural settings Behavioral- Measure the ways a construct reveals itself |
|
|
Term
Nominal Scale |
Definition
Label and categorize observations with no quantitative distinctions -Colors, political party, major, gender |
|
|
Term
Ordinal Scale |
Definition
Categories organized in an ordered sequence in terms of size or magnitude -Year in school, place in a race, 2-3-4 star restaurant |
|
|
Term
Interval Scale |
Definition
Ordered categories that are all intervals of exactly the same size, arbitrary 0 -Fahrenheit temp (0 is not absence of temp, can be lower), golf scores (above or below par) |
|
|
Term
Ratio Scale |
Definition
Interval scale with an absolute 0 (absence of variable), meaningful ratios -Height, weight, reaction time, frequency of behavior |
|
|
Term
Multiple Measures |
Definition
Pros- Allows a more complete measure of the construct in question Cons- Desynchrony (measures may not agree- measuring different things) may cause difficulty interpreting results of data analysis |
|
|
Term
Sensitivity and Range Effects |
|
Definition
Measurement tools must be sensitive enough to record changes when changes are actually there (as perfect as possible) Ceiling effect- cluster of scores at high end (test too easy) Floor effect- scores at low end (test too hard) |
|
|
Term
Relationship between reliability and validity |
|
Definition
Reliability indices do not indicate whether a particular measure is an accurate measure of the variable Can have reliability without validity Can't have validity with no reliability Precision- Degree of reliability Accuracy- Degree of validity |
|
|
Term
Face Validity |
Definition
The content of the measure appears to reflect the construct being measured |
|
|
Term
Content Validity |
Definition
The content of the measure is linked to the universe of content that defines the construct |
|
|
Term
Predictive Validity |
Definition
The measure of a construct accurately predicts a hypothesized behavior (SAT- how well you'll do in college) |
|
|
Term
Concurrent Validity |
Definition
How well a measure performs concurrent with another established measure -Give an already established test along with the new test- if the results are similar, the new test would be valid |
|
|
Term
Convergent Validity |
Definition
The measure of a construct is related to other measures of the same construct |
|
|
Term
Divergent (Discriminant) Validity |
Definition
The measure of a construct is NOT related to other measures that are theoretically different |
|
|
Term
Types of Reliability |
Definition
Inter-rater- If you have two observers watching the same behavior, their scores should agree with each other Internal Consistency- Within a test, people should respond in a consistent way to all of the questions Test-retest- If you give a test to a person more than once, they should get about the same score each time |
|
|
Term
Split-Half Reliability |
Definition
Split items in a questionnaire in half, score each half, then calculate the consistency between two scores for a group of participants |
|
|
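The split-half procedure just described can be sketched in a few lines of Python. The odd/even item split and the sample data are illustrative assumptions; the Spearman-Brown formula at the end is the standard correction for the fact that each half is only half as long as the full test:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """item_scores: one row of item responses per participant.
    Scores odd- and even-numbered items separately, correlates the
    two half-test scores across participants, then applies the
    Spearman-Brown correction for full test length."""
    odd = [sum(row[0::2]) for row in item_scores]
    even = [sum(row[1::2]) for row in item_scores]
    r = pearson_r(odd, even)
    return 2 * r / (1 + r)  # Spearman-Brown prophecy formula

# Hypothetical questionnaire data: 3 participants x 4 items
rel = split_half_reliability([[1, 1, 1, 1], [2, 2, 2, 2], [3, 3, 3, 3]])
```

With perfectly consistent responding, as in the toy data above, the estimate is 1.0; inconsistent responding pulls it toward 0.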
Term
|
Definition
Compare every item on the test with every other item on that test |
|
|
Term
|
Definition
Compare performance on each item to performance on complete measure |
|
|
Term
Ways to increase reliability |
|
Definition
Include multiple measures for each construct Include multiple items within each measure Have good operational definitions |
|
|
Term
|
Definition
The individual making the measurements can introduce simple human error |
|
|
Term
|
Definition
Small changes to the environment that differ from one measurement to another -time of day/lighting |
|
|
Term
|
Definition
Less reliability will have a flatter and wider distribution More reliability won't be as spread out- scores clustered around mean |
|
|
Term
Every Measurement Includes |
|
Definition
True Score= Measurement of real value on a variable Error= Incorrect responses caused by chance or poor test design |
|
|
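The True Score + Error idea (classical test theory writes every observation as X = T + E) can be illustrated with a small simulation; the true score and the size of the error here are arbitrary assumptions chosen for the example:

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

TRUE_SCORE = 100  # assumed real value on the variable
ERROR_SD = 5      # assumed spread of the chance error

def measure(n):
    """Each observed score = true score + random error (X = T + E)."""
    return [TRUE_SCORE + random.gauss(0, ERROR_SD) for _ in range(n)]

scores = measure(1000)
mean_score = sum(scores) / len(scores)
# Chance error averages out to roughly zero, so the mean of many
# repeated measurements lands close to the true score.
```

This is why repeated or multiple measurements improve reliability: the true-score component is stable across measurements while the error component is not.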
Term
|
Definition
The degree to which the measurement process measures the variables it claims to measure |
|
|
Term
|
Definition
The stability or consistency of the measurement |
|
|
Term
|
Definition
Characteristics or conditions that change or have different values for different individuals -Anything that can be measured -An abstract concept that must be translated into concrete forms of observation or manipulation |
|
|
Term
Two important aspects of measurement |
|
Definition
Often there is not a one-to-one relationship between the variable and the measurements obtained Different methods may be used to measure the same variable, which may lead to different findings |
|
|
Term
|
Definition
Describe characteristics of the situation or environment -Temp (hot/cold), location (suburban/urban) |
|
|
Term
Participant or Subject Variables |
|
Definition
Describe characteristics of the participant or subject -Gender, age, ethnicity |
|
|
Term
|
Definition
Bridges one variable to another -It is influenced by one variable and influences another |
|
|
Term
Threats to external validity |
|
Definition
Any characteristic of a study that limits the generalization of the results -From a sample to the general population -From one research study to another -From a research study to a real-world situation |
|
|
Term
Three Categories of Confounding Variables |
|
Definition
General ( All Studies)- Environmental variables Group Related (Studies comparing groups)- Assignment bias Time-Related (Studies comparing one group over time) |
|
|
Term
Threats to Internal Validity |
|
Definition
Any factor that allows for an alternative explanation -Extraneous Variables- Any variable other than those specifically being studied -Confounding Variables- Any extraneous variable that changes systematically along with the two variables being studied |
|
|
Term
|
Definition
The extent to which the operational definition of a variable reflects the true theoretical meaning of the variable |
|
|
Term
|
Definition
Research produces a single, unambiguous explanation for the relationship between two variables |
|
|
Term
|
Definition
Extent to which research can be generalized to people, settings, times, measures, and characteristics other than those used in the study |
|
|
Term
|
Definition
The truth of the research/accuracy of the results Does the study accurately answer the question it was intended to answer? |
|
|
Term
|
Definition
The extent to which your conclusions are correct Statistically: significance, power, meaning of test utilized Logically: are you over-interpreting your results? |
|
|
Term
|
Definition
Any component of a research study that introduces questions or raises doubts about the quality of the research or accuracy of the research results |
|
|
Term
Critically Evaluating Research |
|
Definition
Construct Validity- Evaluate the adequacy of the operational definition Internal Validity- Evaluate the extent to which it was the independent variable that caused the changes or differences in the dependent variable |
|
|
Term
Advantages of Multiple Methods |
|
Definition
Artificiality of experiments limits generalizability of the results Field experiments lack control, but provide natural context Ethical and practical considerations- You cannot assign children into a "spanking" group Participant variables are nonexperimental- You cannot assign someone into a gender group Description of behavior- A major goal of psychological science is to provide accurate descriptions of events |
|
|
Term
|
Definition
Inferences of cause and effect require three elements: 1. Temporal precedence 2. Covariation between two variables 3. Need to eliminate plausible alternative explanations |
|
|
Term
|
Definition
Compare difference on one variable between a grouping variable (two or more groups) Intended to answer cause-and-effect questions Rigorous control Random assignment to groups |
|
|
Term
|
Definition
Intended to demonstrate a relationship between variables, but cannot explain it No cause-and-effect No rigorous controls Potential for confounds No manipulation |
|
|
Term
The Operational Definition |
|
Definition
A procedure for measuring and defining a construct Specifies a measurement procedure (a set of operations) for measuring an external, observable behavior Using the resulting measurements as a definition and a measurement of the hypothetical construct Limitations Easily oversimplified Often influenced by extraneous, unknown factors |
|
|
Term
Method of Tenacity (Not scientific) |
|
Definition
Information is accepted as true because it has always been believed or because superstition supports it -Based on habit and superstitions -Mere exposure effect- The more we are exposed to something, the more we tend to believe it (ex. Opposites attract) Problem: May not be accurate, and there is no method for correction |
|
|
Term
|
Definition
Illusion of a relationship between two things (people from small towns are so nice!) |
|
|
Term
|
Definition
Q= Question What is the main theoretical question being addressed? M=Methods Give a concise summary of the methods being utilized R=Results Give a concise summary of important results I= Implications Explain the implications of the reported results for the theoretical questions |
|
|
Term
|
Definition
Impersonal style: not a personal story should be written in objective tone Verb tense: use past or present perfect tense Avoid biased language: free of implied or irrelevant evaluation of groups |
|
|
Term
Method of Authority (Non-scientific) |
|
Definition
Information is accepted because you got it from an authority on the subject Problems Relies on assumed expertise of the person you asked We assume the expertise can be generalized to include your question Expert might be biased or influenced by subjectivity Unquestioned acceptance of “expert” knowledge |
|
|
Term
The Rational Method/ Rationalism (Non-scientific) |
|
Definition
Answers questions through logical reasoning (Philosophical) Premise statements- Describe facts or assumptions that are presumed to be true Arguments- Logically combine premise statements to yield a conclusion |
|
|
Term
Problems with The Rational Method |
|
Definition
Premise statements may be untrue Not all possibilities may be considered Logical Fallacies 1. All psychologists are human 2. Some humans are women Conclusion- Therefore, some psychologists are women |
|
|
Term
The Empirical Method (Non Scientific) |
|
Definition
Uses observation or direct sensory experience to obtain knowledge Problems- Prior knowledge, expectations, feelings and beliefs can influence perception We often misperceive the world around us Often dangerous |
|
|
Term
|
Definition
1. To describe behavior 2. To predict behavior 3. To determine the causes of behavior 4. To understand or explain behavior |
|
|
Term
|
Definition
Describing Behavior 1.Careful Observations -Cunningham’s (1997) examination of judgments of physical attractiveness over time 2. Predicting Behavior -Regular observations that two events are systematically related to one another, it is possible to make predictions |
|
|
Term
|
Definition
1. Temporal Precedence 2. Covariation of the cause and effect 3. Rule out alternative explanations |
|
|
Term
|
Definition
Tries to answer fundamental questions about the nature of behavior Studies often address theoretical issues concerning phenomena such as cognition, emotion, learning, personality development, social behavior |
|
|
Term
|
Definition
Conducted to address issues in which there are practical problems and potential solutions Study results have immediate practical implications |
|
|
Term
|
Definition
Generalize from a small set of specific examples to the complete set of all possible examples |
|
|
Term
|
Definition
1. Science is empirical- we use firsthand observations to test hypotheses 2. Science is public- you share your findings, it's not just for you 3. Science is objective- you shouldn't have a personal vested interest |
|
|
Term
|
Definition
-Use of scientific sounding terms to substantiate claims that are not accurate or true Astrology Marketing ploys that claim to enhance memory or sex |
|
|
Term
Characteristics of Pseudoscience |
|
Definition
Hypotheses generated are typically not testable If scientific tests are reported, methodology is not scientific and validity is questionable Supportive evidence tends to be anecdotal or relies heavily on “authorities” or “experts” Claims tend to be vague and ignore conflicting evidence |
|
|
Term
Scientific Method 5 steps |
|
Definition
1. Observe behavior or other phenomena (inductive reasoning) 2. Form a tentative answer or explanation- a hypothesis (variables) 3. Use your hypothesis to generate a testable prediction (deductive reasoning/the rational method) 4. Evaluate the prediction by making systematic, planned observations (the empirical method) 5. Use the observations to support, refute, or refine the original hypothesis |
|
|
Term
|
Definition
Physical considerations Follow protocols and train researchers Monitor participant’s status and provide needed follow up care Recruit appropriate populations Psychological Considerations Remind participants of the right to withdraw Provide counseling or support and thoroughly debrief Social, Legal, Economic Protection |
|
|
Term
|
Definition
Pregnant women Human fetuses Prisoners Children- Parent signs consent; child provides assent (an indication of willingness to participate) Failure to object is NOT assent Mentally disabled persons- Legal guardian signs consent, participant provides assent Economically or educationally disadvantaged persons |
|
|
Term
|
Definition
1. Resolving Ethical Issues 2. Competence 3. Human Relations 4. Privacy and Confidentiality 5. Advertising and other public statements 6. Record keeping and fees 7. Education and Training 8. Research and Publication 9. Assessment 10. Therapy |
|
|
Term
Institutional Review Board (IRB) |
|
Definition
Headed by scientists/ nonscientists in any institution/ agency conducting research with human participants 1. Minimization of risks to participants 2. Reasonable risk in relation to benefits 3. Equitable selection 4. Informed consent 5.Documentation of informed consent 6. Data Monitoring 7. Privacy and confidentiality |
|
|
Term
|
Definition
"Public Health" study conducted from 1946-1948 in Guatemala Penicillin efficacy trials Injection of gonorrhea and syphilis by U.S. Government researchers into 696 subjects- Guatemalan prisoners, mental patients, women No knowledge or consent |
|
|
Term
The Milgram Study (1961) |
|
Definition
Will participants obey orders that conflict with their personal conscience? -Confederate Learner- mentions heart condition before start of experiment -Participant- Told to play the role of the "teacher" "Learner" must learn word pairs Administer increasing levels of shock for incorrect answers -Prompts used if resistant: Please continue… You have no other choice, you must go on |
|
|
Term
Tuskegee Syphilis Study (1972) |
|
Definition
“Public Health” study conducted in Alabama between 1932-1972 To study the natural progression of the disease 399 poor, rural African American men with syphilis recruited Free medical exams, free meals, and free burial Told they were being treated for “bad blood” Penicillin validated as effective treatment in 1947! |
|
|
Term
|
Definition
Compensation that is too attractive may: Blind participants to the risks or impair their ability to exercise good judgment Prompt participants to lie or conceal information Considerations must include participants: Medical, employment, and educational status Financial, emotional, and community resources Therapeutic Misconception The tendency for participants to downplay or ignore the risks posed to their own well being due to their belief that their participation has been designed for their own benefit |
|
|
Term
Documentation of Informed Consent |
|
Definition
Form must contain: Overview and Purpose of the Study Descriptions of Procedures Potential Risks and Benefits Costs for Participation (usually none) Compensation Confidentiality (including limitations) Alternatives to Participating Voluntary Participation Contact info for Principal Investigator Form must be signed by the participant (or legal representative) |
|
|
Term
|
Definition
Three fundamental aspects of informed consent 1. Voluntariness- Consent must be freely given or truly voluntary 2. Comprehension- Individuals must have the mental capacity to understand the information presented to them 3. Disclosure- Researchers must disclose all aspects of the study |
|
|
Term
APA Ethics Code (2010) 5 Basic Principles |
|
Definition
1. Beneficence and Nonmaleficence- Do no harm, maximize benefits, and minimize harm 2. Fidelity and Responsibility- Protecting privacy/confidentiality/anonymity 3. Integrity- Full disclosure of any conflicts of interest, only use deception when there is no other method for acquiring unbiased data (must debrief) 4. Justice- Individuals and groups must be treated fairly and equitably in terms of bearing the burdens and receiving the benefits of research 5. Respect for People's Rights and Dignity- Individuals should be treated as autonomous agents (independent persons) |
|
|
Term
|
Definition
Respect for persons Beneficence Justice |
|
|