Term
|
Definition
research is objective and can be measured |
|
|
Term
If a researcher is questioning whether or not a study is actually addressing their research question/hypothesis, what is the researcher questioning? |
|
Definition
|
|
Term
Dana hypothesizes that the more fashion magazines someone reads, the less positive they are about their body image. What type of hypothesis is this? |
|
Definition
|
|
Term
A hypothesis is a specific, testable statement about a single variable? T/F |
|
Definition
|
|
Term
Which qualitative method is the quickest to conduct? |
|
Definition
|
|
Term
If a distribution follows a normal curve, the mean, median, and mode would all be different values. T/F |
|
Definition
|
|
Term
|
Definition
research should be used in all aspects of the process |
|
|
Term
|
Definition
1. Defining the problem 2. Planning the program 3. Implementing the program 4. Evaluating the program |
|
|
Term
Defining the Problem - PR |
|
Definition
1. Environmental monitoring (look into the environment) 2. Public relations audits (broad; internal & external) 3. Communication audits: identify subjects (internal & external publics) 4. Social audits (documents being created; surveys very specific, single issue) |
|
|
Term
|
Definition
Interpret the data gained in the problem-definition phase; decide what needs to be done and what media need to be used |
|
|
Term
|
Definition
Gatekeeping/output analysis: helps with evaluation; how much coverage is being obtained |
|
|
Term
_____ refers to research that helps develop effective advertisements and determines which of several ads is the most effective |
|
Definition
|
|
Term
In lecture, we defined "research" as a scientific, quantifiable attempt to discover something |
|
Definition
|
|
Term
One of the benefits of focus groups is the wealth of quantitative data they typically provide (T/F) |
|
Definition
|
|
Term
|
Definition
Under-the-radar marketing: hire teens to log on to chat rooms to talk up bands; recruit college students to throw campus parties; can backfire (e.g., Sony) |
|
|
Term
|
Definition
Encourages individuals to pass on a marketing message to others; uses existing social networks; cost-effective reach; key: people want to share |
|
|
Term
|
Definition
Reach vs. frequency; audience size and composition; efficiency studies; competitor research |
|
|
Term
New directions in advertising |
|
Definition
The 30-second TV ad, long-standing king of ads, is losing effectiveness |
|
|
Term
New directions in advertising: losing effectiveness because: |
|
Definition
DVRs allow skipping of commercials (90% regularly skip ads); prominent demographics are moving to more interactive forms of entertainment (video games, internet) |
|
|
Term
|
Definition
Video iPod, TiVo + other DVRs; ad prices are based on program-ratings assumptions |
|
|
Term
Creative solution: product placement |
|
Definition
The paid placement of brands and products; studies have found that the practice is fairly common in movies & TV, mostly low-involvement products (excluding cars) |
|
|
Term
video game product placement |
|
Definition
$300M spent on in-game ads last year; placements can range from $20K to $1M depending on performance/prominence |
|
|
Term
Benefits for game publishers, gamers, and advertisers |
|
Definition
competitive interactive ad |
|
|
Term
|
Definition
Pre-test/post-test; key is to define what your outcome/objective is |
|
|
Term
|
Definition
Applied, strategic, evaluation, basic |
|
|
Term
|
Definition
o It's always been true • You eat turkey on Thanksgiving o Tied to prior-held beliefs o Beliefs are hard to change • Flat Earth Society o What if knowledge has changed but beliefs haven't? • Women belong at home, not in the workplace • I never advertised before, why should I now? |
|
|
Term
|
Definition
o Someone (who should know) says so • Doctor diagnosis • Mommy says so o What if that person is wrong? o Again, hard to change and may not consider new information |
|
|
Term
|
Definition
o Truth is self-evident • Common sense o What if two individuals' common sense tells them different things? • Politics • Religion |
|
|
Term
Problems with “Everyday” Ways |
|
Definition
• Filters how we process info o False premise; illogical reasoning o Selective observation; expectations • Everyday ways of knowing can even lead to conflicting ideas about “truth” o Absence makes the heart grow fonder o Out of sight, out of mind |
|
|
Term
|
Definition
• The 4th way of knowing • Requires systematic analysis • Always open to new info o Nothing is ever “proven” with science • Tests questions of fact, not questions of value o How do customers react to this advertisement? o Is it ethical to try to get people to buy things? |
|
|
Term
Characteristics of Scientific Method (1) |
|
Definition
• Public o Must be available to others to scrutinize o Ability to replicate • Must be very detailed o We cannot advance science if we don’t know what others are doing |
|
|
Term
characteristics of scientific method 2 |
|
Definition
• Objective o Procedures and rules must be followed o Try to be as free of bias as possible • In practice, this goal is impossible. Granting agencies, accessibility, current concerns, personal interests influence what is researched. |
|
|
Term
characteristics of scientific method 3 |
|
Definition
o Can be tested, measured • Conceptual (constitutive) definitions • Operational definition |
|
|
Term
characteristics of scientific method 4 |
|
Definition
• Systematic and cumulative o Research should be done with purpose o Research should build on what has been done in the past • Builds upon theory (related propositions that specify relationships among concepts) • Also feeds back to the idea of "publicness" |
|
|
Term
characteristics of scientific method 5 |
|
Definition
• Predictive o Science should help predict what will happen in the future o This characteristic is highly debatable. Explanation, description and exploration are also common goals ***disagrees with the book*** |
|
|
Term
|
Definition
Select topic o Topic Relevance • Is the question too broad? • Can the problem be investigated? • Can the data be analyzed? • Is the problem significant? • Can the results be generalized? |
|
|
Term
|
Definition
Review: Theory/Literature |
|
|
Term
|
Definition
Develop Questions o Hypotheses v. research questions |
|
|
Term
|
Definition
Statement of a testable, expected relationship between two concepts (always a statement). Example: Individuals who have higher levels of trait aggressiveness will be more likely to engage in political flaming |
|
|
Term
|
Definition
When the researcher is unsure of the relationship between two concepts (always a question); no theory/prior research to guide. Example: Is there a relationship between trait hostility and political flaming? |
|
|
Term
|
Definition
|
|
Term
|
Definition
• Survey • Experiment • Content analysis |
|
|
Term
|
Definition
• Focus group • Interviews • Case studies • Observations |
|
|
Term
|
Definition
• There is an objective reality • It can be measured • Given appropriate techniques, findings can be generalized to others o External validity: how well can the results be applied to other situations/people? • Sampling is key here |
|
|
Term
|
Definition
• Reality is socially constructed • Each person’s experience is unique, therefore cannot be generalized • Value is in understanding a situation deeply, rather than being able to apply it to other situations • In depth knowledge, emotion, determines how you phrase questions |
|
|
Term
|
Definition
• 5. Collect Data • 6. Analyze and interpret data o Internal Validity & Reliability • Internal validity: are you measuring what you think you are measuring? • Reliability: can you measure it consistently? • 7. Inform others |
|
|
Term
|
Definition
• Deduction • Theories • Hypotheses • Observations • Empirical generalizations • Induction |
|
|
Term
|
Definition
• Basic vs. applied research • Proprietary information |
|
|
Term
Goals of Qualitative Research Research Paradigms |
|
Definition
• Positivism o Objective, measurable effects of media and prediction of media use • Interpretive o Understanding how individuals interpret events and create meaning • Critical o Examines power structures and their influence on media |
|
|
Term
Key Differences in the goals of research |
|
Definition
• Role of the researcher • Design • Setting • Measurement • Theory building |
|
|
Term
Purpose of Qualitative Research |
|
Definition
• Used by those who have an interpretive or critical paradigm • Goal is to develop a rich understanding of people's subjective experiences |
|
|
Term
“Trustworthiness” of Data |
|
Definition
• Qualitative research is NOT concerned with: o Reliability & validity of measurement o Internal & external validity • Instead, focus is on researcher interpretations o “credible” study (rings true) o “confirmable” (well-reasoned) conclusions o “transferable” ideas for reader |
|
|
Term
|
Definition
• Essentially a “focused” group interview • 6 to 12 participants • participants should be fairly homogenous, but should not know each other • discussion is recorded in order to be reviewed and transcribed later • researcher roles: o moderator o note takers |
|
|
Term
Steps involved in focus group |
|
Definition
• problem definition • sample selection • determine number of groups • prepare for study mechanics • prepare materials • conduct the session • analyze the data |
|
|
Term
|
Definition
• Researchers have identified that the role of social media is expanding, particularly in young adult/college-aged populations. Given the constantly evolving nature of social media, in addition to its relatively new appearance on the media scene, our understanding of how it is being used and what impact it has on users is still relatively limited |
|
|
Term
|
Definition
• Texas Tech University students o Gender? o Major? o Year in school? o Single? In a relationship? o Have a job? o What are they involved in? Activities, organizations, etc. o Dispersed family? Or all located in the same city? • Remember: need 6 to 10 participants; ideally they do not know each other |
|
|
Term
Determine number of groups |
|
Definition
• You are all required to do a single group • In other situations, many want multiple groups with different characteristics, depending on your research problem o Women vs. men, freshmen vs. upperclassmen |
|
|
Term
|
Definition
• Main research questions • Leading questions • Testing questions • Steering questions • Obtuse questions • Factual questions • “feel” questions • anonymous questions • silence |
|
|
Term
|
Definition
• Start general, move to more specific • Consider question importance re: research problem • Number of questions varies, especially dependent on the number of "probes" or follow-up questions used o General rule – no more than 12 main questions o Initial questions should be unstructured – participants should feel free to respond in a variety of ways; use probes for more specific information |
|
|
Term
|
Definition
• Opening question: easy, factual question. Serves as ice breaker • Introduction questions: general impressions of the topic • Transition questions: guide participants toward key topic • Key questions: 2 to 5 questions that address the heart of the issue • Ending questions: summarize what has been said, ask for clarification |
|
|
Term
|
Definition
• Continuum of participant observer o Role may move along continuum during course of study • Observer – no interaction • Observer as participant – minimal interaction • Participant as observer – often are acting as member of group • Participant – fully acting as member of group |
|
|
Term
Participant observation process |
|
Definition
• Observations o Setting o Participants o Events o Acts o Gestures o Dialog |
|
|
Term
|
Definition
• Descriptive o Portray context • Analytic o Write during process of observation o Expand upon later that day • Autobiographical o Your own behaviors & emotions |
|
|
Term
Steps in participant observation |
|
Definition
• Choose the research site • Gain access • Sampling • Collecting data • Analyzing data • Exiting |
|
|
Term
|
Definition
• Interviews are an interaction • Types o Structured o In-depth • Open • Depth probing |
|
|
Term
Types of In-Depth Interviews |
|
Definition
• Life history o Autobiographies geared to understanding social theories • Event centered o Understanding what the researcher can’t directly observe • Situational o Understanding a situation from multiple people’s perspectives |
|
|
Term
Steps to in-depth interviewing |
|
Definition
• Ensure interviewing is appropriate o Clear goals, unable to observe, slightly larger sample needed, modest time constraints • Select interviewees o Seek saturation • Approach interviewees o Approach will vary based on sensitivity level • Motives • Anonymity • Final say • Money • Logistics |
|
|
Term
|
Definition
• Your research ?s are NOT your interview ?s o Interview ?s are more specific and contextual • Often helpful to start broadly, but concretely o Grand tour question o Descriptive questioning • Solicited narratives • Log-interview • Personal documents • Modify based on pilot testing |
|
|
Term
|
Definition
• May take several sessions • Try to let interviewee select location and time • Be aware of power dynamic o Nonjudgmental o Let people talk o Pay attention o Be sensitive |
|
|
Term
|
Definition
• Examines a single “system of action” instead of individual subjects • Uses multiple techniques to examine the “case” o Observation o Textual analysis o Focus groups/interviewing |
|
|
Term
|
Definition
• Exploratory • Explanatory • Descriptive |
|
|
Term
|
Definition
o Done to understand a given case, typically used as a pilot test |
|
|
Term
|
Definition
o Done to help explain a phenomenon after it occurred; often will compare multiple cases |
|
|
Term
|
Definition
o Starts with a theory and examines how a case builds or contradicts that theory. |
|
|
Term
Steps involved in a case study |
|
Definition
• Determine the research question • Select the cases and determine data gathering and analysis techniques • Prepare to collect the data • Collect the data in the field • Evaluate and analyze the data • Prepare the report |
|
|
Term
Common moderator problems |
|
Definition
Personal bias; unconscious need to please the client; need for consistency |
|
|
Term
Special considerations of moderator |
|
Definition
• Steer discussions toward their experiences • Seek out distinctions between the different social media outlets |
|
|
Term
Role of the note taker –verbal |
|
Definition
• Serves to help with transcription o Noting speaker changes • Note quotes that seem to stand out • Note general conclusions or themes • Good idea to assign two or more group members as note takers (quotes!) to back up evidence |
|
|
Term
Role of note taker – physical |
|
Definition
• Same purpose as verbal note takers, but more important if video recording was not done • Note reactions of other participants o Others' comments o General nature • Should have at least one note taker focusing on body language and one person in charge of the audio recordings |
|
|
Term
|
Definition
• Comfort of participants is key • Moderator should have seating position that reinforces leadership • Note takers should be present, but should be inconspicuous |
|
|
Term
If jay believes that it’s good to eat vegetables because he’s always eaten vegetables and has always been healthy, what way of knowing would he be using to reach this conclusion? |
|
Definition
• Tradition/tenacity • Authority • Logic/intuition • Scientific method
(A) |
|
|
Term
Which of the following major events or social forces has contributed to the growth of mass media research? |
|
Definition
• Baby boom generation • The development of broadcasting • The public's demand for sports programming • The need for marketers to keep abreast of the environment • World War I
(E) |
|
|
Term
Internal and external validity are as important for qualitative research as they are for quantitative research |
|
Definition
|
|
Term
Which qualitative method would a researcher be using if he or she collected data by going into the field and observing established groups? |
|
Definition
• Focus groups • Participant observation • In-depth interviewing • Case studies • (b) |
|
|
Term
Asking participants to bring in diaries and journals is a technique used in what type of in-depth interview questioning? |
|
Definition
• Descriptive questioning • Solicited narratives • Log-interviews • Personal document interviews • (D) |
|
|
Term
Science: being predictive is not the only goal of science, |
|
Definition
exploration, description, etc. are also goals |
|
|
Term
|
Definition
• Start with hypotheses, then go through the process |
|
|
Term
|
Definition
• Start with observations, then form a hypothesis (more qualitative) |
|
|
Term
|
Definition
• Most people in academics do – looking at testing theories, e.g., about how an individual conforms |
|
|
Term
|
Definition
• People with careers do • Cool hunting (applying basic research) • Big distinction between the two – applied researchers are doing it for an organization or a company – they don't feed it back into the wheel of science and don't want competitors to find out about research the company paid for |
|
|
Term
|
Definition
• A working definition of what the concept means for purposes of investigation |
|
|
Term
|
Definition
How exactly the concept will be measured in a study |
|
|
Term
|
Definition
• # of times that characters are shown punching, hitting and shooting |
|
|
Term
|
Definition
• The process of translating a vague, abstract construct into something that can be measured • How do we know if we've done a good job? |
|
|
Term
|
Definition
• Are you measuring what you think you are measuring? |
|
|
Term
|
Definition
• Can you measure it consistently? |
|
|
Term
Types of internal validity |
|
Definition
• Face validity • Content validity • Criterion/predictive validity • Construct validity |
|
|
Term
|
Definition
o The measure seems to look good on its face (appears to fit) |
|
|
Term
|
Definition
o The measure captures the full range of meanings/dimensions of the concept |
|
|
Term
Criterion/predictive validity |
|
Definition
• Measure is shown to predict scores on some other relevant future measure o Ex: IQ predicting college GPA (see sketch below) |
|
|
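Note: as a rough illustration of the criterion/predictive validity idea above, the sketch below correlates a measure with a later criterion (the IQ–college GPA example from the card). The scores and the helper function are invented for illustration only.

```python
# Minimal sketch (illustrative only): criterion/predictive validity checked by
# correlating a measure (IQ) with a later criterion (college GPA). Scores are made up.

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

iq_scores   = [95, 100, 105, 110, 120, 130]   # measure taken earlier (made-up)
college_gpa = [2.6, 2.9, 3.0, 3.1, 3.4, 3.8]  # criterion observed later (made-up)

# A strong positive correlation would support the measure's predictive validity.
print(round(pearson_r(iq_scores, college_gpa), 2))
```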
Term
|
Definition
• Measure is shown to be related to other concepts that it should be related to (and not to ones it shouldn't) • Known-group method – we know there is a group that should score higher than others • Theoretical expectation |
|
|
Term
|
Definition
• Does your measure provide consistent results • Stability: test-retest |
|
|
Term
|
Definition
• Split-half and Cronbach's alpha (see sketch below) o Equivalency • Related: Variables o Independent variable (IV): cause + predictor (comes first) o Dependent variable (DV): effect + outcome |
|
|
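Note: as a rough illustration of the internal-consistency idea above, here is a minimal Python sketch of Cronbach's alpha. The item scores are invented for illustration only.

```python
# Minimal sketch: Cronbach's alpha for a multi-item scale (made-up data).

def cronbach_alpha(items):
    """items: list of lists, one inner list of scores per item (same respondents in the same order)."""
    k = len(items)                                   # number of items
    n = len(items[0])                                # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]  # each respondent's total score
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Three 5-point Likert items answered by five respondents (made-up scores).
items = [
    [4, 5, 3, 4, 2],
    [5, 5, 3, 4, 1],
    [4, 4, 2, 5, 2],
]
print(round(cronbach_alpha(items), 2))   # values around .70 or above are usually treated as acceptable
```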
Term
|
Definition
o Variable measured merely with different categories o Class rank example o Categories must be: o Mutually exclusive: only one applies o Exhaustive: all possible options available (why we have an "other" option) |
|
|
Term
|
Definition
o Measured with rank-order categories o Tells you placement but not how far apart o Ex: 1, 2, 3 finish order, but you do not know the times o Rank your major news source: 2 TV, 1 internet, 3 newspaper, 4 other o How much TV did you watch yesterday? 1 (none), 2 (a little), 3 (a lot) • Interval o Measurement points with equal distance between them o No true zero |
|
|
Term
Discrete vs. Continuous Variables |
|
Definition
• Discrete = nominal • Continuous = interval, ratio • Most communication scholars include ordinal • Fine for purposes of this class, but not a good idea statistically |
|
|
Term
|
Definition
• Agreement: strongly agree to strongly disagree • Frequency: never to always • Satisfaction: completely satisfied to completely dissatisfied |
|
|
Term
|
Definition
• Relationship between measure and reality • Often not very good in social science |
|
|
Term
|
Definition
What is it? • Describes how you are selecting your subjects to study • Representative or probability-based: used to generalize to a larger population • Non-representative or non-probability: used when you are looking for a specific characteristic (equal data) |
|
|
Term
|
Definition
• The entire group of people you are interested in learning about • Importance to clearly define • All people over 18 living in the US • All citizens over the age of 18 • All registered voters with phones • All registered voters who are likely to vote |
|
|
Term
|
Definition
• When the entire population is measured • Representative sampling: o First must define the population you want to generalize to o All members of the population must have an equal chance of being selected/included o Elements of that population are randomly selected o Sampling units include: o Individuals (i.e., households) o Groups (i.e., couples) o Units (i.e., organizations) o Artifacts (i.e., TV shows) |
|
|
Term
|
Definition
• Obtain a list of all population members • Assign #s to all members • Randomly select #s until the desired sample size is reached • #s selected from a table of random #s, or let a computer select them (see sketch below) |
|
|
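Note: a minimal Python sketch of the "let the computer tell us" step, using a hypothetical numbered sampling frame.

```python
import random

# Hypothetical sampling frame: a numbered list of every population member.
population = [f"member_{i}" for i in range(1, 501)]   # 500 members (made-up)

sample_size = 50
# random.sample draws without replacement, so each member has an equal chance of selection.
sample = random.sample(population, sample_size)
print(sample[:5])
```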
Term
|
Definition
• Start with a phone # in a given area code • i.e., 806 • Obtain all exchanges in the area code • i.e., (785), (744), (745) • Generate random ending #s until the sample size is reached • Or start with a known # then add… (see sketch below) |
|
|
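Note: a minimal sketch of how the random ending digits could be generated, using the area code and exchanges mentioned on the card purely as placeholders.

```python
import random

area_code = "806"                      # placeholder area code from the card
exchanges = ["785", "744", "745"]      # placeholder exchanges from the card

def random_phone_number():
    """Pick an exchange, then generate a random four-digit ending."""
    exchange = random.choice(exchanges)
    ending = f"{random.randint(0, 9999):04d}"
    return f"({area_code}) {exchange}-{ending}"

# Keep generating numbers until the desired sample size is reached.
sample = [random_phone_number() for _ in range(10)]
print(sample)
```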
Term
|
Definition
• Obtain a list of all population members • Assign #s to all members • Randomly select a start position in the list • Select every "Kth" element from the list o K = population size / sample size (see sketch below) |
|
|
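Note: a minimal sketch of the Kth-element rule above, with a made-up sampling frame.

```python
import random

population = [f"member_{i}" for i in range(1, 1001)]  # 1,000 members (made-up frame)
sample_size = 100

k = len(population) // sample_size     # K = population size / sample size
start = random.randrange(k)            # random start position within the first K elements
sample = population[start::k]          # then select every Kth element

print(f"K = {k}, sample of {len(sample)}")
```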
Term
Multi-stage cluster sampling |
|
Definition
• Randomly select clusters • Randomly select participants from within the clusters • Use this when subjects aren't listed as individuals • You can combine this with stratified sampling o Ex: all counties in the USA o Ex: College of Mass Comm. (see sketch below) |
|
|
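Note: a minimal sketch of the two random stages described above, using hypothetical clusters (the county names are placeholders).

```python
import random

# Hypothetical clusters: counties, each containing residents not listed individually anywhere else.
clusters = {
    "County A": ["resident_1", "resident_2", "resident_3", "resident_4"],
    "County B": ["resident_5", "resident_6", "resident_7"],
    "County C": ["resident_8", "resident_9", "resident_10", "resident_11"],
    "County D": ["resident_12", "resident_13"],
}

chosen_clusters = random.sample(list(clusters), 2)           # stage 1: randomly select clusters
sample = []
for name in chosen_clusters:
    members = clusters[name]
    sample += random.sample(members, min(2, len(members)))   # stage 2: randomly select people within each cluster

print(chosen_clusters, sample)
```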
Term
|
Definition
• Your sample will never be a perfect representation of the population because of chance o i.e., flip a coin 10 times – unlikely to get exactly 5 heads + 5 tails • Match to the population will increase with sample size, but you tend to have diminishing returns |
|
|
Term
|
Definition
• Statistical estimate of the amount your results will vary from the true population value (see sketch below) |
|
|
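Note: a rough worked example using the standard margin-of-error formula for a proportion at a 95% confidence level (the sample size is made up; the formula itself is not from the lecture).

```python
import math

n = 400          # sample size (made-up)
p = 0.5          # observed proportion; 0.5 gives the most conservative (largest) margin
z = 1.96         # z-value associated with a 95% confidence level

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"+/- {margin_of_error * 100:.1f} percentage points")   # roughly +/- 4.9 points for n = 400
```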
Term
|
Definition
• How certain you are that the population value falls within your result's given margin of error o i.e., how often you could be wrong,… |
|
|
Term
|
Definition
• individuals volunteer to be included o extra credit o try out a product before it is released o discounts o highly involved participants |
|
|
Term
|
Definition
• research samples with a certain number of subjects in various categories o i.e. 100 males, 100 females |
|
|
Term
|
Definition
• the researcher approaches one subject, asks that subject to suggest others, and it continues… • useful for hard to reach subjects, to understand social networks |
|
|
Term
|
Definition
• Professor evaluations • Public opinion surveys • Audience response analysis • Product/service satisfaction • Any time you systematically ask people about their attitudes, emotions, beliefs, knowledge, intentions, or behaviors |
|
|
Term
|
Definition
• Identify/describe attitudes or behaviors (in a given population) • Examine relationships between variables measured o Does x relate to y? o Do a, b, & c together predict Y? |
|
|
Term
• Open-ended survey questions |
|
Definition
o Participants generate responses • More detail • Allows for unforeseeable responses • Time-intensive |
|
|
Term
• Close-ended survey questions |
|
Definition
o Participants choose from provided responses • Uniform responses allow for easy comparability, quick analysis • May force respondents to choose a response they do not completely agree with |
|
|
Term
|
Definition
• Check all that apply • Multiple-choice • Semantic differential scales • Likert scales • Rank ordering |
|
|
Term
General guidelines for writing survey questions |
|
Definition
• Be clear • Keep questions short • Avoid double negatives • Avoid double-barreled questions • Avoid leading questions • Avoid asking about issues beyond participants' capability to answer • Avoid false premises • Avoid asking embarrassing questions unless necessary |
|
|
Term
|
Definition
• Social desirability bias • To overcome, provide "face-saving" options |
|
|
Term
How we perceive ourselves influences… |
|
Definition
• In the low-frequency scale, people responded that they were less pleased with how they spent free time • In the high-frequency scale, people reported being more satisfied with how they spend their free time |
|
|
Term
|
Definition
• Advantages o Cost; length o No interviewer influence o Can have visual content • Disadvantages o Must be self-explanatory – can't clarify questions o Low response rate o Long data collection window |
|
|
Term
|
Definition
• select sample • create questionnaire • write cover letter • create the package • distribute the survey • monitor rates • follow-ups? • Analyze |
|
|
Term
|
Definition
• Advantages o Quick data collection window o Moderate cost o Ability to clarify questions o Moderate response rate (once you get someone to answer) • Disadvantages o Moderate interviewer bias o Short questionnaire o No visual depictions o Cell phones? o Push polls |
|
|
Term
|
Definition
• Controversial tactic o Pretends to be a phone survey. Real goal is: • Campaign for a candidate/idea/proposition • Sell a product • Statements are often not based on fact, or are very generous interpretation of fact |
|
|
Term
Telephone Survey: Process |
|
Definition
• Select a sample • Construct the questionnaire • Prepare interviewer instructions • Train interviewer • Collect data • Make callbacks • Verify results • Analyze data |
|
|
Term
|
Definition
• Advantages o Cheapest o Quickest o No interviewer bias o Moderate response rate o Can have audio-visual content • Disadvantages o Hard to get a representative sample o "Big brother" concerns o Must be self-explanatory o Length |
|
|
Term
|
Definition
• Select a sample • Construct the questionnaire • Write cover letter • Program survey • Collect data • Follow up? • Analyze data |
|
|
Term
|
Definition
• Advantages o Clarification possible o Can have audio-visual content o Highest response rate o Length • Disadvantages o Interviewer bias o Expensive o Longer collection window |
|
|
Term
Face to face survey: process |
|
Definition
• Select a sample • Construct the questionnaire • Train interviewer • Collect data • Make callbacks • Verify results • Analyze data |
|
|
Term
|
Definition
• Coders should agree on what they see. If not, that's a problem • To ensure that the coding scheme is reliable, we have to test it o Coders score identical content o Results are compared using statistical tests for reliability (see sketch below) o Intercoder reliability vs. intracoder reliability |
|
|
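Note: a minimal sketch of one simple statistical check mentioned above — percent agreement between two coders scoring identical content. The category labels are invented; published studies often also report a chance-corrected statistic such as Cohen's kappa.

```python
# Two coders' judgments of the same six units of content (made-up labels).
coder_a = ["violent", "neutral", "violent", "neutral", "violent", "neutral"]
coder_b = ["violent", "neutral", "violent", "violent", "violent", "neutral"]

agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a)
print(f"{percent_agreement:.0%}")   # 5 of 6 units coded the same -> about 83%
```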