Term
|
Definition
TIME FRAME AND KINDS OF IMPLEMENTATION
|
|
|
Term
SOME USEFUL ACRONYMS TO KNOW FOR RESEARCH |
|
Definition
YRBSS: Youth Risk Behavior Surveillance System (CDC)
BRFSS: Behavioral Risk Factor Surveillance System (CDC)
YTS: Youth Tobacco Survey (CDC)
NHANES: National Health and Nutrition Examination Survey (CDC)
HaPI: Health and Psychological Instruments
MEDLINE: Medical Journals
ERIC: School Health
HRSA: Health Resources and Services Administration
NIH: National Institutes of Health
CINAHL: Health Education, nursing
|
|
|
Term
What is the name of the element and the tool being used?
Clues: One of the elements of the tool is:
Consulting insiders (e.g., leaders, staff, clients, and program funding sources) and outsiders (e.g., skeptics);
Taking special effort to promote the inclusion of less powerful groups or individuals |
|
Definition
Element: Engage Stakeholders
Engaging stakeholders: those involved, those affected, primary intended users.
Tool: CDC six-step Framework for Program Evaluation |
|
|
Term
These are examples of the _______ standard of the CDC six-step framework for program evaluation
Information scope and selection. Information collected should address pertinent questions regarding the program and be responsive to the needs and interests of clients and other specified stakeholders.
Values identification. The perspectives, procedures, and rationale used to interpret the findings should be carefully described so that the bases for value judgments are clear. |
|
Definition
Utility standards
Utility standards ensure that an evaluation will serve the information needs of intended users (information scope and selection and values identification are utility standards, not accuracy standards) |
|
|
Term
The following are examples of which standard of the CDC six step framework?
Political viability. During planning and conduct of the evaluation, consideration should be given to the varied positions of interest groups so that their cooperation can be obtained and possible attempts by any group to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.
Cost-effectiveness. The evaluation should be efficient and produce valuable information to justify expended resources. |
|
Definition
Feasibility standards
Feasibility standards ensure that an evaluation will be realistic, prudent, diplomatic, and frugal |
|
|
Term
The following are examples of which element of the CDC six step framework?
Listing specific expectations as goals, objectives, and criteria for success;
Clarifying why program activities are believed to lead to expected changes;
Drawing an explicit logic model to illustrate relationships between program elements and expected changes |
|
Definition
Element: Describe the program
Describing the program: Need, expected effects, activities, resources, stage, context, logic model |
|
|
Term
What kind of research can be characterized by the following examples:
• Identifying the tobacco use habits of teenagers
• Explaining how parents feel about year-round school |
|
Definition
Descriptive research
Descriptive research attempts to determine, describe, or identify what is |
|
|
Term
Based on the following information, what evaluation model would be best to use when evaluating this program?
The goal of the program is to increase obese children’s physical activity by 15%.
The program is not necessarily focused on reaching that goal; it is more focused on the children understanding the benefits of exercising. |
|
Definition
Goal-free
Goal-free evaluations are not based on goals; the evaluator searches for all outcomes, including unintended positive and negative side effects. |
|
|
Term
Descriptive data may be classified into 4 different types |
|
Definition
Nominal Ordinal Interval Ratio |
|
|
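The four data types from the card above can be sketched in Python. This is a minimal illustration with hypothetical example values (eye colors, a rating scale, temperatures, BMI values), not part of the original deck:

```python
# Nominal: categories with no order. Ordinal: categories with a logical order.
# Interval: numeric with equal spacing but no true zero. Ratio: numeric with a true zero.

nominal = ["blue", "brown", "green"]   # eye color: categories, no inherent order
ordinal = [1, 2, 3, 4, 5]              # satisfaction rating: ordered categories
interval = [36.0, 72.5, 98.6]          # temperature in degrees F: no true zero
ratio = [18.5, 24.9, 30.1]             # BMI: true zero, so ratios are meaningful

# Ratios are only meaningful for ratio-scale data:
print(round(ratio[1] / ratio[0], 2))   # "24.9 is about 1.35x 18.5" is valid
# 72.5 degrees F is NOT "about twice as hot" as 36 degrees F, because 0 F is arbitrary.
```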
Term
What are some of the different software packages used to collect and analyze qualitative data? |
|
Definition
|
|
Term
A college campus that currently allows smoking is going to become a smoke-free campus. What kind of assessment should be done to evaluate the potential health effects before this policy change? |
|
Definition
Health impact assessment (HIA)
Health impact assessments are used to objectively evaluate the potential health effects of a project or policy before it is developed or implemented.
Major steps:
1. Screening to identify projects or policies for which an HIA would be useful
2. Scoping to identify which health effects to consider
3. Assessing risks and benefits to determine which people may be affected and how they may be affected
4. Developing recommendations to suggest changes to proposals to promote positive or mitigate adverse health effects
5. Reporting to present the results to decision-makers
6. Evaluating to determine the effect of the HIA on the decision. |
|
|
Term
Seeking answers to the following questions is indicative of what kind of analysis?
• Someone's interpretation of the world
• Why they have that point of view
• How they came to that view
• What they have been doing
• How they identify or classify themselves and others in what they say |
|
Definition
Qualitative analysis
Qualitative analysis comes in some form of explanation, understanding, or interpretation of the people and situations we are investigating.
Evaluators examine qualitative data to identify:
1. Patterns, recurring themes, similarities, and differences
2. Ways in which patterns (or lack thereof) help answer evaluation questions
3. Deviations from patterns and possible explanations for divergence
4. Interesting or particularly insightful stories
5. Specific language people use to describe phenomena
6. To what extent patterns are supported by past studies or other evaluations (and if not, what might explain the differences) |
|
|
Term
Participant observation, document study, interviews, and focus groups are examples of what kind of data collection strategies? |
|
Definition
Qualitative
Qualitative methods are descriptive in nature and attempt to discover meaning or interpret why phenomena are occurring. |
|
|
Term
What kind of data is shown as stats, numbers, graphs, etc.? |
|
Definition
Quantitative
Quantitative methods focus on measuring things related to health education programs through the use of numerical data to help describe, explain, or predict phenomena. |
|
|
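Because quantitative data is numerical, it supports the summary statistics, graphs, and numbers the cards above mention. A minimal sketch using Python's standard library, with hypothetical values for minutes of weekly physical activity:

```python
import statistics

# Hypothetical quantitative data: minutes of physical activity per week
minutes = [30, 45, 60, 60, 90, 120]

print(statistics.mean(minutes))    # average of the values
print(statistics.median(minutes))  # middle value of the sorted list
print(statistics.mode(minutes))    # most frequently occurring value
```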
Term
An investigation aimed at ascertaining the status of a set of variables, such as the number and variety of persons with specific conditions in a specified population, but without any critical analysis or attempt to test causal hypotheses, is known as a ________ study design.
Examples include the U.S. National Health Care Survey, periodic reports from cancer registries, and needs assessment surveys conducted by a local health department. |
|
Definition
Descriptive
Descriptive research attempts to determine, describe, or identify what is |
|
|
Term
Analytical studies are also known as.... |
|
Definition
Cohort or case-control studies |
|
|
Term
Descriptive studies are also known as.... |
|
Definition
|
|
Term
A study in which a hypothesis is assumed and where the participating group is studied to understand why they have certain characteristics is indicative of what kind of study design? |
|
Definition
Analytical
Also known as a cohort or case-control study |
|
|
Term
Outcome evaluation |
Definition
is focused on the ultimate goal, product or policy
HAPPENS AT THE END OF IMPLEMENTATION STAGE
Outcome evaluation looks at WHAT HAPPENED BECAUSE OF WHAT YOU DID
(EX: laws against texting while driving were put into place within the community where your program took place) |
|
|
Term
Impact evaluation |
Definition
focuses on IMMEDIATE AND OBSERVABLE EFFECTS of a program leading to the desired outcomes
HAPPENS AT THE END OF IMPLEMENTATION STAGE
(EX: because of your program 75% of participants stopped texting while driving) |
|
|
Term
Summative evaluation |
Definition
associated with measures or judgments that enable the investigator to draw conclusions
HAPPENS AT THE END OF IMPLEMENTATION STAGE
Summative assessment is characterized as assessment OF learning
(EX at the end of the program participants knew how to identify drivers who were texting and driving) |
|
|
Term
Process evaluation |
Definition
any combination of measures that occur as a program is implemented to assure or improve the quality of performance or delivery
HAPPENS DURING IMPLEMENTATION STAGE
(EX. Did participants enjoy and learn more from guest speakers or group discussion) |
|
|
Term
Formative evaluation |
Definition
looks at an ongoing process of evaluation from planning through implementation
formative assessment is an assessment FOR learning.
HAPPENS AT THE PLANNING AND IMPLEMENTATION STAGES
(EX: Were all reading materials written at appropriate readability levels for the program participants?) |
|
|
Term
Operational definitions |
Definition
are operational forms of a construct; they designate how the construct will be measured in designated scenarios. |
|
|
Term
Validity |
Definition
the degree to which a test or assignment measures what it is intended to measure. Using a valid instrument increases the chance of measuring what was intended |
|
|
Term
Reliability |
Definition
refers to the consistency, dependability and stability of the measurement process |
|
|
Term
What are the three methods used by health education specialists to avoid conducting research that has already been done several other times? |
|
Definition
Systematic reviews Meta-analyses Pooled analyses |
|
|
Term
Systematic review |
Definition
A published qualitative review that comprehensively synthesizes the publications on a particular topic |
|
|
Term
Meta-analysis |
Definition
a systematic method of evaluating statistical data based on results of several independent studies of the same problem |
|
|
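The meta-analysis card above describes combining statistical results from independent studies. One standard way to do that is a fixed-effect, inverse-variance weighted average; the sketch below uses hypothetical effect sizes and standard errors, and is one common pooling method rather than the only one:

```python
import math

# Hypothetical (effect size, standard error) pairs from three independent studies
studies = [
    (0.30, 0.10),
    (0.10, 0.20),
    (0.25, 0.15),
]

# Inverse-variance weights: w_i = 1 / SE_i^2, so precise studies count more
weights = [1 / se**2 for _, se in studies]

# Pooled effect is the weighted average; pooled SE shrinks as studies accumulate
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(round(pooled, 3), round(pooled_se, 3))
```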
Term
Pooled analysis |
Definition
A method for collecting all the individual data from a group of studies, combining it into one large data set, and then analyzing the data as if it came from one big study |
|
|
Term
The following questions are good questions to ask yourself when doing what?
• Was the purpose of the study stated?
• Was the research question or hypothesis stated?
• Were the subjects described? Did the literature describe subject recruitment?
• Was the study design and location of the study described?
• Were the data collection instruments described?
• Did the presented results reflect the research question or hypothesis?
• Were the conclusions reflective of the research design and data analysis?
• Were the implications meaningful to the priority population? |
|
Definition
|
|
Term
Name five general guidelines for creating new data collection tools, such as a survey or questions to ask during a focus group. |
|
Definition
1. Avoid long lists of choices
2. Avoid abbreviations, especially those that are not very common
3. Avoid items that require yes/no answers when you are trying to collect qualitative data
4. Avoid assumptions about participants' knowledge, lifestyles, likes, dislikes, etc.
5. Avoid probing for certain answers or responses |
|
|
Term
What are the four STANDARDS of the CDC Six Step Framework for Program Evaluation |
|
Definition
Utility Feasibility Propriety Accuracy |
|
|
Term
What are the six STEPS of the CDC Six Step Framework for Program Evaluation |
|
Definition
Engage Stakeholders Describe the Program Focus on Evaluation Design Gather credible evidence Justify conclusions Ensure use and share lessons learned |
|
|
Term
Primary, secondary, tertiary, qualitative, and quantitative are all descriptive of different kinds of what? |
|
Definition
|
|
Term
An example of descriptive Nominal Data would be everyone who has blue eyes. Why? |
|
Definition
Because nominal data is categorical but has no particular order |
|
|
Term
An example of descriptive ordinal data would be everyone rating something 1-5, with 5 being the best and 1 being the worst. Why? |
|
Definition
Because ordinal data is categorical with a logical order. |
|
|
Term
An example of descriptive interval data would be temperature in degrees Fahrenheit. Why? |
|
Definition
Because interval data is numerical with equal intervals between values but no true zero point. |
|
|
Term
An example of descriptive ratio data would be BMI. Why? |
|
Definition
Because ratio data is numerical with a true zero point, so decimal and fractional values are possible and ratios between values are meaningful. |
|
|
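The BMI example above can be made concrete: BMI is computed from weight and height, both ratio measures with a true zero, which is why BMI values can be fractional and can be meaningfully compared as ratios. A minimal sketch:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m**2

# A 70 kg person who is 1.75 m tall: 70 / (1.75^2) = 22.857...
print(round(bmi(70, 1.75), 1))  # 22.9
```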