Term
|
Definition
A change in academic performance or behavior as a result of intervention and instruction |
|
|
Term
|
Definition
Identifies at-risk students 1. prevention approach 2. ecological perspective 3. utilizes a problem-solving method |
|
|
Term
|
Definition
Lowering the rate of a disorder in a population over a period of time by counteracting harmful effects before they have an opportunity to produce this disorder |
|
|
Term
Traditional Discrepancy Model |
|
Definition
Disability is viewed as a deficit that resides within the individual, the severity of which might be influenced, but not created by contextual variables |
|
|
Term
|
Definition
2 reasons for low achievement 1. instructional concerns 2. disability |
|
|
Term
|
Definition
Effectively teach all students; early remediation; problem solving in a multi-tiered model; employ valid interventions; regular PM; DBDM; assess for -screening -ID of the problem -PM |
|
|
Term
|
Definition
Screening Intervention PM DBDM |
|
|
Term
|
Definition
Treatment validity; time to assistance; level of support; data-based approach to problem solving |
|
|
Term
|
Definition
Not the "wait to fail Model"
Discrepancy 1) student is performing significantly below that of their peers and what is expected at their grade ---Dis.= initial skills or performance Discrepancy 2) despite implementation of interventions, the student fails to "close the Gap" with class mates ---Dis2)= dis. in rate of learning/time |
|
|
Term
Steps of RTI for an Individual Case |
|
Definition
1. estimate the gap 2. determine the likely reason for performance 3. select an intervention 4. PM frequently 5. if the student fails to respond to several interventions, consider SpEd placement |
|
|
Term
What do schools have to do differently under RTI? |
|
Definition
1. structured format for problem solving 2. knowledge of various interventions 3. ability to use different methods of assessment to PM |
|
|
Term
|
Definition
1. CBM's 2. Daily Behavior Report Cards 3. Functional Behavioral Assessment |
|
|
Term
Deno's problem Solving Method |
|
Definition
1. ID the problem 2. define the problem 3. design the intervention 4. implement the intervention 5. problem solution |
|
|
Term
Problem Solving Method Step 1 |
|
Definition
a. list problem behaviors and prioritize b. collect baseline data on primary concern c. state discrepancy between student performance and expected performance -cum. file review - parent int. -teacher int -classroom observation -gap analysis -student int. |
|
|
Term
Problem Solving Step 2 (problem analysis) why is the problem happening? |
|
Definition
1. collect additional RIOT data -differentiate between can't do / won't do -determine the situations in which the behavior is most likely/least likely to occur -generate hypotheses for why the problem is occurring
2. narrow down to the most valid and alterable hypothesis |
|
|
Term
Problem Solving Step 2 b. (plan development) |
|
Definition
1. What is the goal? -write the goal in measurable terms 2. what is the intervention plan to address the goal -define logistics (who, what, when, where) -select an intervention 3. how will progress be monitored? -define logistics -define decision making rules |
|
|
Term
Problem solving step 3- Plan implementation (how will implementation integrity be measured) |
|
Definition
1. communicate clear plan 2. provide training and support 3. observe intervention 4. provide method for PM 5. make adjustments to plan 6. collect and graph data 7. ensure fidelity |
|
|
Term
Problem Solving step 4 plan evaluation |
|
Definition
1. use data to determine progress 2. evaluate the intervention 3. apply decision rules 4. determine what to do next |
|
|
Term
Decision rules to be considered in problem solving steps |
|
Definition
-3 consecutive data points above the aimline = increase the goal -3 consecutive data points below the aimline = change the intervention -3 consecutive data points on the aimline = maintain the intervention -trendline flatter than the goal line = change the intervention -trendline steeper than the goal line = increase the goal -trendline equal to the goal line = maintain the intervention (see the sketch after this card) |
|
|
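Note: a minimal Python sketch of the 3-consecutive-data-point rules above, assuming three recent progress-monitoring scores paired with the aimline values for the same dates; function and variable names are illustrative, not from the source.

def decision_from_points(scores, aimline):
    """Apply the 3-consecutive-data-point decision rules.

    scores  -- the three most recent progress-monitoring scores
    aimline -- the expected (aimline) values for the same dates
    """
    if all(s > a for s, a in zip(scores, aimline)):
        return "increase the goal"
    if all(s < a for s, a in zip(scores, aimline)):
        return "change the intervention"
    return "maintain the intervention"

# Example with made-up data: the last three weekly scores vs. the aimline
print(decision_from_points([42, 45, 47], [38, 40, 42]))  # -> increase the goal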
Term
School Psychology and RTI Tier 1 |
|
Definition
1. Curriculum committees -research-based methods 2. consultation with teachers -phases of instruction -help ensure that quality instruction occurs 3. consultation with admin -assessment system to collect data -DBDM ---interpreting scores ---deriving criteria for identifying students at risk |
|
|
Term
School psych and RTI tier 2 |
|
Definition
1. DBDM 2. Matching assessment to skills for int. planning 3. consulting with teach. and admin about potential int. 4. familiar with research |
|
|
Term
School psych and RTI tier 3 |
|
Definition
1. knowledgeable about problem solving 2. knowledgeable about interventions 3. research-based approach 4. brief experimental analysis |
|
|
Term
School psych and RTI across the tiers |
|
Definition
implementation integrity
collaboration |
|
|
Term
RTI and the effect on School Psychology |
|
Definition
Increased hours engaged in intervention and problem solving
decreased hours engaged in assessment -most assessment now consists of behavioral observations |
|
|
Term
At what point does RTI assessment occur? |
|
Definition
Everything we do in RTI= assessment - screening -PM -norming and benchmarking |
|
|
Term
|
Definition
-RTI is considered an int. based assessment - when results from interventions are used to make decisions, interventions become assessments --DBDM --Student Response informs future actions |
|
|
Term
RTI and Specific Learning Dis. |
|
Definition
-When interventions are used as assessments, the results assist in accomplishing 2 goals 1. inform further intervention or program efforts ---keep doing what works 2. provide evidence of the presence or absence of SLD |
|
|
Term
Effective Instruction and Assessment
BASIC COMPONENTS OF EDUCATION (CIA) |
|
Definition
CURRICULUM- WHAT WE TEACH -skills, concepts, and applications INSTRUCTION- HOW WE TEACH -how the info is conveyed -the goal of the lesson should be considered ASSESSMENT- IS WHAT WE ARE DOING WORKING? -(CBA, CBM, FCAT, etc.) -assessment MUST BE MATCHED TO INSTRUCTION -choose measures according to the info being sought |
|
|
Term
Evidence of efficacy within curriculum |
|
Definition
-does the program work? -curricula must be tested carefully -What Works Clearinghouse --study design --homogeneity of groups -----meets standards without reservations -----meets standards with reservations -----does not meet standards |
|
|
Term
Teaching with fidelity within curriculum |
|
Definition
Teaching with accuracy -teaching the program the way it was intended -essential for effective curriculum -following the core as it is outlined ensures students are learning -changing an ingredient in the core may alter the outcome and affect student performance |
|
|
Term
|
Definition
-direct, straightforward, and clear --telling the students exactly what the skill is and how to apply it
Key components 1. break materials down into small steps 2. objectives clearly stated and related to performance 3. provide opportunities to make novel connections 4. practice embedded in the lesson 5. students receive additional practice to promote independence 6. feedback is provided after practice |
|
|
Term
|
Definition
Time = one of the most important factors in determining student outcomes, and it is scarce
PRIORITIZE -time must be allotted to cover all the content |
|
|
Term
|
Definition
1. match instruction to students' needs 2. differentiating the content, process, and products to meet varying student needs is critical to student success |
|
|
Term
opportunities for practice |
|
Definition
1. multiple opportunities for practice are essential - helps students move from instructional level to mastery -should vary in design, format and length
Distributed practice- practice that occurs throughout the day in short bursts |
|
|
Term
Assessment (four categories of assessment in school) |
|
Definition
1. Screening- is there a problem? 2. diagnostic- what is the problem? 3. PM- is the instruction successful? 4. outcome/accountability- how are we doing? |
|
|
Term
|
Definition
Shorter practice over many days is better than one long learning session
forgetting curve |
|
|
Term
KEY FEATURES OF EFFECTIVE INSTRUCTION |
|
Definition
5 FEATURES OF EFFECTIVE INSTRUCTION
1. CONTENT 2.DELIVERY 3.PACE 4. RESPONSES 5. ASSESSMENT |
|
|
Term
Content within effective instruction |
|
Definition
-Substance of what is to be learned --matched with what students are expected to learn in order to progress in the curriculum
--fits in between the knowledge and/or skills that students have already learned and what they will learn from subsequent instruction
follows order of operations = easier to master |
|
|
Term
Delivery within effective instruction |
|
Definition
Methods used to present information and solicit learning 1. explicit description of knowledge and/or skills to be learned 2. students given many opportunities to rehearse the new skill 3. all students are actively engaged with the knowledge or skills throughout the lesson |
|
|
Term
Pace within effective instruction |
|
Definition
"Goldilocks"= must be just right 1. intensive massed practice of new material is provided during early stages of learning. -all activities related - lessons have clear, single focus 2. judicious review of prev. learned knowledge and skills is incorporate at reg intervals |
|
|
Term
Responses within effective instruction |
|
Definition
Teachers need to provide immediate feedback in order to max. learning |
|
|
Term
Assessment within effective instruction |
|
Definition
Formative- students are assessed continuously during the lesson; daily student performance is recorded -helps ID struggling students
Summative- data about student progress are reviewed at least weekly to determine whether instructional changes are needed |
|
|
Term
KEY FEATURES OF EFFECTIVE ASSESSMENT |
|
Definition
Assessment in schools 1. screening 2. diagnostic 3.PM 4. outcome |
|
|
Term
Four key features of effective assessment (define, specify, use, conduct) |
|
Definition
1. define the target skill or behavior 2. specify the setting where data will be collected 3. use an accurate data recording format 4. conduct careful analysis and interpretation |
|
|
Term
Defining the target skill or behavior |
|
Definition
1. Operationally define the desired skill or behavior -topography -frequency -duration -intensity |
|
|
Term
|
Definition
The location where the skills or behaviors will be observed and recorded
if more than one setting= code for specific settings |
|
|
Term
|
Definition
Data recording procedures and materials are developed to specify 1. who will collect 2. where will data be recorded 3. when will data be rec. 4. how will data be rec.
materials for rec. need to be accessible |
|
|
Term
analysis and interpretation |
|
Definition
1. once a specified amount of data is collected, the info is reviewed to determine whether the intervention produced the desired outcomes.
2 major aspects of data to consider 1. level- score value obtained by the student --can be used to ID how an individual's performance compares to other students 2. slope- rate of progress --reveals how quickly a student is moving toward the goal and is predictive of whether the goal will be met in time (see the sketch after this card)
DATA are used to revise, increase, decrease, or discontinue the intervention |
|
|
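Note: a minimal Python sketch of the two data aspects named above, level and slope; using the median of recent scores for level and a least-squares slope for rate of progress is a common convention assumed here, and all names and numbers are made-up.

from statistics import median

def level(recent_scores):
    """Current level: the median of the most recent scores."""
    return median(recent_scores)

def slope_per_week(scores):
    """Least-squares slope, treating consecutive scores as one week apart."""
    n = len(scores)
    x_bar, y_bar = (n - 1) / 2, sum(scores) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(scores))
    den = sum((x - x_bar) ** 2 for x in range(n))
    return num / den

student = [12, 14, 13, 16, 18, 19]
print(level(student[-3:]))                # 18 -- compare against the peer or benchmark level
print(round(slope_per_week(student), 2))  # 1.43 units gained per week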
Term
|
Definition
Know why we should screen! |
|
|
Term
Critical skills in literacy |
|
Definition
phonemic awareness oral reading fluency |
|
|
Term
|
Definition
early numeracy skills computation fluency |
|
|
Term
|
Definition
1. all children receive measures of academic competency evaluated at the classroom or system level -how many students are responding to instruction? -is instruction effective? -how many are at risk for failure? -who are the students who need more support?
Screening- controls for variation in skills, knowledge, and expectations
Multiple assessments are critical -growth over time --fall = health of the classroom --winter = visible progress should be seen |
|
|
Term
Individual application of screening PM @ tier 1 |
|
Definition
Multiple screenings can reduce or eliminate false positives
screening data are reviewed as part of the first step of the problem solving process (what is the problem?) -consider prior screening --health, vision, hearing, achievement, discipline, attendance, benchmark |
|
|
Term
General Education responsibilities for screening |
|
Definition
1. administer school wide screening 2. administer assessments, chart and eval results 3. ID students for further monitoring 4. provide info to parents |
|
|
Term
Specialist and support staff responsibilities for screening |
|
Definition
1. assist the gen ed teacher with implementation 2. collect data on the screening tool and associated cut points to help inform the process 3. collaborate with the gen ed teacher in determination of further screening 4. present students ID'd as at risk during screening to school teams as candidates for more intensive PM |
|
|
Term
Admin and MTSS team in screening |
|
Definition
1. create infrastructure for screening 2. provide professional development 3. ensure fidelity of implementation 4. research the availability of screening tools 5. determine if performance warrants intervention 6. provide data from screening to teachers and personnel |
|
|
Term
Fundamental assumptions of screening |
|
Definition
1. student achievement is aligned with --curriculum --instruction --assessment 2. problems are contextually defined 3. culture determines what to screen 4. direct measures of student performance are more reliable predictors of student need 5. school psychologists need to understand that screening will help shape their work |
|
|
Term
Contextually defined problems |
|
Definition
1. low performers display a mismatch between the behavior they exhibit and what the system demands ----low achievement is in relation to students in that context 2. disability- defined in the context of a school system rather than in the context of some inferred trait within the student 3. what implications does this have for ID? --students can be "cured" of a disability by changing schools --students can become "disabled" by moving to a high-achieving school |
|
|
Term
Cultural imperatives vs. Cultural electives |
|
Definition
1. emotional/behavioral health + proficiency in reading, writing, and math = pursuit of happiness 2. schools are only expected to screen for cultural imperatives -a value judgment needs to be made when deciding what skills to universally assess
Cultural electives -art -sports, etc. |
|
|
Term
Direct Measures of Performance |
|
Definition
1. direct measures are more reliable than inferential measures of performance --Inferential-IQ, etc
2. Screening focuses on academic and behavioral indicators linked to curriculum |
|
|
Term
General Steps of screening |
|
Definition
1. define the domain 2. determine criteria for acceptable performance and select an instrument or approach for data collection 3. gather and organize resources 4. implement screening activities 5. examine data for discrepancy 6. communicate and determine future actions |
|
|
Term
|
Definition
1. what areas should be screened 2. what info is needed to determine if a student is at risk for failure 3. literacy 4. mathematics 5. writing 6. behavior |
|
|
Term
|
Definition
1. phonemic awareness -initial sound fluency -phoneme segmentation
2. Alphabetics -letter naming fluency -letter sound fluency -nonsense word fluency -word ID fluency
3. Oral reading fluency -research indicates it is a measure of general reading competence; predictive of performance on large-scale assessments
4. reading comprehension -CBM = Maze |
|
|
Term
|
Definition
1. Early Math Fluency -quantity discrimination -missing number -number ID -oral counting
2. math computation 3. math concepts and applications |
|
|
Term
|
Definition
1. alphabet writing fluency 2. spelling 3. written expression --story starters |
|
|
Term
|
Definition
1. PBS= exemplar for universal screening system of behavior 2. school based data - discipline referrals -attendance records -observations
3. published tools -behavior rating scales -observations for screening |
|
|
Term
Determine approach and criteria for acceptable performance |
|
Definition
1. does the assessment align with the curriculum? 2. are the screening tools valid and reliable? 3. criteria are used to determine discrepancy -should over-identify
Over-identification- the least dangerous assumption |
|
|
Term
over identification least dangerous assumption |
|
Definition
Sensitivity- the probability that the screening tool reliably IDs students who are at risk
Specificity- the probability that the tool does not incorrectly ID those who are not at risk (see the sketch after this card) |
|
|
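Note: a minimal Python sketch of the two screening accuracy indices defined above; the counts in the example are made-up illustrations.

def sensitivity(true_positives, false_negatives):
    """Proportion of truly at-risk students the screener flags."""
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives, false_positives):
    """Proportion of not-at-risk students the screener correctly clears."""
    return true_negatives / (true_negatives + false_positives)

# Example: of 40 at-risk students, 36 were flagged; of 160 not at risk, 140 were cleared
print(sensitivity(36, 4))     # 0.9
print(specificity(140, 20))   # 0.875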
Term
|
Definition
1. efficient 2. aligned with curriculum 3. technical adequacy 4. use standard procedures 5. provide objective, observable, low-inference info about student performance 6. criterion referenced -facilitates decision making -"level of performance" rather than "trend of performance" decisions |
|
|
Term
|
Definition
1. common complaint from school personnel 2. critical skill 3. logical targets for evaluating learning outcomes -ability to complete academic tasks is highly related to mastery of future skills |
|
|
Term
Why not Standardize achievement tests? |
|
Definition
1. not easily administered 2. administration, scoring, and interpretation take time 3. such measures are not easily repeated during the course of the year 4. not appropriate for universal screening |
|
|
Term
Gather and organize resources |
|
Definition
1. time 2. personnel 3. training 4. materials 5. scoring 6. system for inputting data for analysis 7. plan |
|
|
Term
implement screening activities |
|
Definition
1. implement with fidelity 2. stick to the schedule 3. assess student absences 4. have a system for screening new students 5. score 6. input data 7. generate graphs |
|
|
Term
|
Definition
Computer software or by hand |
|
|
Term
|
Definition
1. visual analysis of data 2. large group problem? 3. small group problem? 4. individual problem? |
|
|
Term
Communicate and Determine future actions |
|
Definition
1. parents, teachers, etc. 2. explain implications and next steps MTSS should be able to answer -how the student is responding -how many students are at risk -is the core instruction effective -which students need additional assessment -what levels of resources might be needed to promote criterion-level performance |
|
|
Term
Be prepared to discuss and communicate with the MTSS team, parents, teachers, etc. |
|
Definition
1. typical growth rates relative to all students 2. data by gender, race, poverty, language, etc. 3. grade-level performance graphs 4. median scores for other classes 5. individual student graphs 6. any patterns in the data 7. how intervention can be efficiently provided |
|
|
Term
What if data indicates a school wide problem? |
|
Definition
School personnel should agree that if data indicate deficits for large groups of students, then the district, admin, and school board should be willing to engage in conversations about changing or enhancing core programming and general instructional practices. Current structures for referring students for additional support and the current mechanisms for delivering supplemental instruction may need to change |
|
|
Term
Unacceptable practices in screening |
|
Definition
-using screening data alone for making entitlement decisions -collecting screening data without a targeted purpose -pursuing a problem individually that is common for a group -engaging in procedures targeted toward determining why a problem is occurring as a screening activity ---screening should be quick and efficient ---screening should be able to be used with large numbers of students |
|
|
Term
|
Definition
CBMs are a form of classroom assessment -describe academic competence -track academic development -improve student achievement |
|
|
Term
|
Definition
Research indicates -CBMs produce accurate, meaningful information about students' academic levels and growth
CBMs are sensitive to student improvement
when teachers use CBMs to inform their instructional decisions, students achieve at higher levels |
|
|
Term
|
Definition
-standardized -compare results to the norm -----national -----local -----criterion |
|
|
Term
|
Definition
1. determined by the purpose of the CBM and what the data will be used for 2. look at state and educational standards 3. review the school's adopted curriculum 4. review other data relevant to the student |
|
|
Term
|
Definition
1. ID the skills in the year-long curriculum 2. determine the weight of skills in the curriculum 3. create alternate forms 4. give tests at the beginning, middle, and end of the year 5. graph and analyze data 6. modify instruction as appropriate |
|
|
Term
Tier 1 and CBMS- Screening |
|
Definition
1. ID skills in the year-long curriculum 2. determine the weight of skills in the curriculum 3. create 30 alternate forms ---each test samples the entire year's curriculum ---each test contains the same types of problems 4. give tests weekly (2x for SpEd) 5. graph and analyze data 6. modify instruction as appropriate |
|
|
Term
CBM for tier 2 and tier 3 |
|
Definition
1. conduct error analysis of the screener 2. conduct survey-level assessment --use a more narrow skill for CBM to measure current levels to inform intervention planning ------generate a timed worksheet for 1 skill --------begin with the lowest-order skill ------don't ignore skills below the current grade level |
|
|
Term
2 types of EBI's in schools |
|
Definition
1. academic 2. emotional, social, behavioral -the presence or absence of behaviors or conditions that interfere with learning and academic achievement
-develop attitudes that are important underpinnings of effective social functioning for career success |
|
|
Term
What constitutes evidence? |
|
Definition
1. sound experimental design 2. problem solving method |
|
|
Term
|
Definition
1. is there a problem? What is it? -clearly defined problem statements 2. what is the discrepancy between what is expected and what is occurring? -students are ID'd at risk for a reason -gather data to determine the gap -validate that the problem is actually a problem --needs to pass the "so what" test |
|
|
Term
Big ideas about Problem ID |
|
Definition
1. not every problem is a problem that warrants problem solving 2. educational need (performance discrepancy) and educational benefit (rate of improvement) Referral-driven PI is acceptable but not sufficient |
|
|
Term
Step 1: Problem ID Under EBI |
|
Definition
1. collect various data from multiple sources and multiple domains 2. list problems and prioritize --determine relevant dimensions of the problem --tackle one problem at a time 3. state the discrepancy --validates the problem |
|
|
Term
Using survey level assessment to ID the problem |
|
Definition
1. survey level assessment -documents the severity and pinpoints the actual skill level -can make statements about the student's ability -specification of an instructional level can be used to plan intervention and set realistic goals |
|
|
Term
|
Definition
1. multiple probes are generated at the student's grade level and below grade level 2. administer grade-level material to the student 3. if the student falls below the instructional standard, administer the next lowest grade 4. administer until the student's performance score is found to be in the instructional range for a particular grade level (see the sketch after this card)
instructional- challenging but the student makes progress
frustration- too difficult
mastery- insufficient challenge |
|
|
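Note: a minimal Python sketch of the survey-level assessment loop described above, stepping down one grade level at a time until a probe score lands in the instructional range; the cut scores below are invented placeholders, not published norms.

# Made-up instructional ranges (e.g., words correct per minute) by grade level
INSTRUCTIONAL_RANGE = {5: (104, 137), 4: (87, 124), 3: (65, 111), 2: (42, 91), 1: (14, 59)}

def survey_level(probe_scores, start_grade):
    """probe_scores maps grade level -> median score from the probes at that grade."""
    for grade in range(start_grade, 0, -1):
        if grade not in probe_scores:
            break  # no probe administered at this level yet
        low, high = INSTRUCTIONAL_RANGE[grade]
        if low <= probe_scores[grade] <= high:
            return grade, "instructional"
    return None, "no instructional level found in the grades sampled"

# Example: a 5th grader scores in the frustration range until 3rd-grade material
print(survey_level({5: 48, 4: 61, 3: 70}, start_grade=5))  # -> (3, 'instructional')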
Term
|
Definition
Definition should include -description of the student's performance in comparison to peers or established standards -indicators of severity -describe the behavior in observable, measurable terms -measure of the desired goal behavior ---baseline, goal line, etc. -academic problems --CBM results |
|
|
Term
Step 2: Analyze the problem (EBI problem analysis)- why is it happening? |
|
Definition
1. RIOT 2. determine skill vs. performance 3. determine when the behavior is most/least likely to occur 4. generate hypotheses 5. collect data to validate or refute the hypotheses 6. select the most validated hypothesis |
|
|
Term
|
Definition
1. Haring and Eaton- determine the level at which performance is deficient
Four Stages of Learning 1. acquisition- skill accuracy 2. fluency- knows the skill well enough to retain it or combine it with other skills; as fluent as peers 3. generalization- uses the skill across settings and situations 4. adaptation- continuous |
|
|
Term
Step 2b- plan development
What shall we do about it? |
|
Definition
1. identify intervention strategies for setting the stage, teaching, and motivating, and define intervention logistics
2. identify implementation logistics 3. identify PM logistics 4. write the goal 5. decide on decision making rules |
|
|
Term
Selecting interventions quickly
The reasonable hypothesis |
|
Definition
We are looking for the reason the child is not learning or behaving 1. test the most likely hypothesis first -Ockham's razor- given two competing theories, the simpler explanation is preferred 2. design an intervention based on this hypothesis, implement the intervention, and monitor and evaluate outcomes 3. if this approach fails, try something more intensive |
|
|
Term
Identify intervention strategy |
|
Definition
1. evidence based 2. targets a single skill what- name and describe the intervention
materials- items/time needed for implementation
who- person responsible for implementation
When, where, how often |
|
|
Term
5 common reasons students fail |
|
Definition
1. the academic activity is too hard 2. not enough help to do it 3. not enough time practicing 4. the student has demonstrated the skill before, but is having difficulty applying the skill in a new manner 5. the student does not want to perform the task |
|
|
Term
|
Definition
1. adequate range of examples to exemplify the concept or problem-solving strategy 2. models of proficient performance -step-by-step strategies 3. experiences where students explain how and why they make decisions 4. frequent feedback- on quality of performance and support so the student persists in activities 5. adequate practice and activities that are interesting and engaging |
|
|
Term
Learn Units (Heward, 1996) |
|
Definition
Opportunities to respond Active student response Performance feedback |
|
|
Term
|
Definition
- the student is presented with a meaningful opportunity to respond to an academic task |
|
|
Term
|
Definition
the student answers an item, solves the problem presented, or completes an academic task. |
|
|
Term
|
Definition
the student receives timely feedback about whether her answer is correct. |
|
|
Term
Included in an intervention plan |
|
Definition
-description of steps -description of needed materials -interventionist -when it will occur -where it will occur -how often it will occur -fidelity measure -goal statement -PM -Decision making rules |
|
|
Term
Select a progress monitoring tool |
|
Definition
-data collection system -data collector -what will be recorded -frequency of data collection -when will data be collected |
|
|
Term
|
Definition
Baseline- repeated measures of the behavior are collected until a stable range of behaviors has been identified --3 data points
things to consider -behavior must be measured directly -accurate, objective measurement related to the dimension of the behavior -same measurement strategy during the int. -measure must be valid and reliable |
|
|
Term
|
Definition
Goal- the intended outcome of the intervention; the direction or extent to which the target behavior is to be changed
Specify: 1. timeframe 2. condition 3. behavior 4. criteria |
|
|
Term
Procedures for selecting interventions RE-AIM |
|
Definition
R- Reach E- Efficacy A- Adoption I- Implementation M- Maintenance |
|
|
Term
Steps for selection and implementation of EBI's
Step 1 |
|
Definition
ID the target population and intervention goals -target population --universal --targeted --intensive
Goals -assess needs and problems |
|
|
Term
Steps for Selection and Implementation of EBIs
step 2 |
|
Definition
1. ID appropriate evidence based intervention |
|
|
Term
Steps for Selection and Implementation of EBIs
Step 3 |
|
Definition
Examine the evidence -efficacy -effectiveness |
|
|
Term
Steps for Selection and Implementation of EBIs
Step 4 |
|
Definition
Evaluate evidence based criteria
-Recommendation- effectiveness studies in 2 or more school settings |
|
|
Term
Steps for Selection and Implementation of EBIs
Step 5 |
|
Definition
evaluate evidence quality
1. internal validity -is the outcome due to the intervention? -random assignment or matching? -implemented as described? -steps taken to ensure fidelity? -were long-term outcomes obtained? 2. external validity -do the results apply to the setting and population you're working with? -did studies occur in similar settings? |
|
|
Term
Steps for Selection and Implementation of EBIs
step 5 cont. |
|
Definition
1. construct validity -do the measures measure what they say they do? -use of valid and reliable measures? -multiple measures used? 2. conclusion validity -are the results true findings or chance? -sufficient sample size? -does the study show differences in outcomes between intervention and control? -does the study report findings for all outcomes with significance levels? |
|
|
Term
What is progress monitoring? |
|
Definition
-a type of assessment -frequent assessment of skills in order to determine if instruction/interventions are effective --helps ID differences within groups --formative assessment -Answers the question: is the intervention/instruction successful? |
|
|
Term
why use progress monitoring? |
|
Definition
1. to see whether a specific intervention is working 2. to show the outcomes of their learning and engage the students in the intervention 3. to provide information about how best to change your instruction, if needed |
|
|
Term
How to use progress monitoring 5 steps |
|
Definition
1. select the progress measure 2. choose the person, time, and place to collect data 3. collect data regularly 4. review the data regularly 5. make instructional changes based on the data |
|
|
Term
Selecting a progress measure |
|
Definition
Progress Measures MUST have 1. Reliability 2. Validity 3. availability in alternate forms 4. sensitivity to student improvement in conjunction with benchmarks 5. demonstrated links to improving student learning and teacher planning 6. specified rates of improvement |
|
|
Term
Collecting Progress Monitoring Data |
|
Definition
Best if collected weekly
Acceptable collection frequencies -Tier 1- 3x per year -Tier 2- at least monthly -Tier 3- at least weekly |
|
|
Term
Reviewing Progress Monitoring Data |
|
Definition
Two ways to make sense of the data 1. slope analysis 2. trend line analysis
Review data to make decisions -maintain the program? -increase/decrease the program? -replace the program? |
|
|
Term
|
Definition
1. Count how many of the data points are above the baseline. 2. calculate the slope -subtract the last baseline score from the ending intervention score -divide by the # of weeks = average growth rate per week -multiply the growth rate by the # of weeks left in the goal time period to predict what the student's level of performance will be if they continue to grow at this rate (see the sketch after this card) |
|
|
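Note: a minimal Python sketch of the growth-rate arithmetic described above; the numbers are made-up illustrations.

def weekly_growth(last_baseline_score, ending_score, weeks_of_intervention):
    """Average growth per week across the intervention period."""
    return (ending_score - last_baseline_score) / weeks_of_intervention

def projected_level(ending_score, growth_per_week, weeks_remaining):
    """Predicted level at the goal date if growth continues at the same rate."""
    return ending_score + growth_per_week * weeks_remaining

rate = weekly_growth(last_baseline_score=22, ending_score=34, weeks_of_intervention=6)
print(rate)                                          # 2.0 units per week
print(projected_level(34, rate, weeks_remaining=4))  # 42.0 by the end of the goal period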
Term
|
Definition
Trendline is a prediction of future scores based on the progress data already collected -provides a more precise indicator of whether the student is on track to meet their goal |
|
|
Term
Data Based Decision Making DBDM |
|
Definition
-DBDM = the process of using data to inform instructional decisions.
Used across every step of the problem solving process |
|
|
Term
|
Definition
1. evaluation that provides the information needed to adjust teaching and learning during the actual instructional process
2. helps ensure students achieve targeted standards-based learning goals within a set timeframe
PM, quizzes |
|
|
Term
|
Definition
1. Evaluation at the end of a treatment/instructional unit to determine program effectiveness
2. measures student learning relative to content standards/ set goals
3. happen too far down the learning path to provide info for instructional adjustment
state assessment, district benchmark assessment, end of unit tests |
|
|
Term
Power of Formative Assessent |
|
Definition
1. Most powerful when students are involved 2. allows students to think critically about their learning 3. students can act as resources to other students 4. research shows that involvement and ownership of the work increases students' motivation to learn |
|
|
Term
|
Definition
When assessment at the classroom level balances formative and summative assessment, a clear picture emerges of where a student is relative to learning targets and standards. Students should be able to share information about their own learning |
|
|
Term
Types of Data Collected for Decision making |
|
Definition
1. Screening data -benchmark (expected) -actual student performance
2. intervention/instructional data -test, assignment, quiz grades
3. intervention data yielded from any tier of intervention
4. progress monitoring -frequent checks of progress at every tier |
|
|
Term
Components Required for DBDM |
|
Definition
1. need to be prepared prior to the data meeting -graph of data ----screening data ----gap analysis (see the sketch after this card) ----intervention and/or instructional data ----progress monitoring data
2. fidelity information -graph or reported percentage ---should be 80% or above
3. observable, measurable goal 4. identified decision rules |
|
|
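Note: a minimal Python sketch of a gap analysis for the data meeting; the benchmark-to-current ratio and the significance cutoff of 2 are common conventions assumed here (the card does not specify a formula), and the numbers are made-up.

def gap_ratio(benchmark, current):
    """How many times below the benchmark the student is performing."""
    return benchmark / current

# Example: winter benchmark of 72 words correct per minute vs. a student score of 36
ratio = gap_ratio(benchmark=72, current=36)
print(ratio)                                              # 2.0
print("significant gap" if ratio >= 2 else "within expected range")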
Term
|
Definition
1. depending upon the frequency of data collection, student progress may be evaluated as early as after several weeks of instruction but may occur following one or two months of instruction
2. standard decision rules help teachers determine when instructional changes may be necessary
3. individual progress monitoring of interventions may incorporate its own decision making framework |
|
|
Term
|
Definition
1. formative evaluation 2. visual analysis- graph interpretation 3. Questions to be answered -did meaningful changes occur? -can the change be attributed to the intervention or instructional program? -should adjustments be made? |
|
|
Term
Heartland Model
3 areas addressed when evaluating intervention |
|
Definition
1. educational progress 2. discrepancy 3. instructional needs |
|
|
Term
|
Definition
Compared to the projected goal, is the student's progress toward the goal what the team expected? -student's rate of skill acquisition vs. typical rates of progress |
|
|
Term
|
Definition
After receiving an intervention, is the student further behind, about the same distance behind, or catching up to their peers? -comparison to peers or another standard at 2 points in time ----before intervention and at the point of review |
|
|
Term
|
Definition
After receiving an intervention, what resources and ongoing elements of the intervention will be needed in the areas of concern to enhance learning and result in maximum progress? --determine what elements of the intervention enhance learning and allow the student to participate in general education |
|
|
Term
|
Definition
1. draw a trend line of student progress (Tukey method) for 7-8 data points and compare it to the student's goal line
2. may use the "four-point rule" if at least three weeks of instruction have occurred and the last 4 scores collected all fall above or below the goal line (see the sketch after this card) |
|
|
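Note: a minimal Python sketch of a Tukey-style trend line (connect the median point of the first third of the scores to the median point of the last third) and of the four-point rule described above; the data and decision labels are illustrative only.

from statistics import median

def tukey_trend(scores):
    """Return (slope per data point, intercept) of the trend line."""
    n = len(scores)
    third = n // 3
    x1, y1 = median(range(third)), median(scores[:third])
    x2, y2 = median(range(n - third, n)), median(scores[-third:])
    slope = (y2 - y1) / (x2 - x1)
    return slope, y1 - slope * x1

def four_point_rule(scores, goal_line):
    """Flag a change when the last 4 scores all fall above or all below the goal line."""
    recent = list(zip(scores, goal_line))[-4:]
    if all(s > g for s, g in recent):
        return "raise the goal"
    if all(s < g for s, g in recent):
        return "change the intervention"
    return "keep collecting data"

scores = [18, 20, 19, 23, 24, 26, 27, 29]
print(tukey_trend(scores))                                        # (1.5, 18.25)
print(four_point_rule(scores, [17, 19, 21, 23, 25, 27, 29, 31]))  # change the intervention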
Term
|
Definition
Florida's exceptional student education categories 1. autism spectrum disorder 2. deaf or hearing impaired 3. dual sensory impairment (deaf/blind) 4. emotional/behavioral disability 5. gifted 6. homebound or hospitalized 7. intellectual disability 8. speech impairment 9. language impairment 10. other health impairment 11. orthopedic impairment 12. specific learning disability 13. traumatic brain injury 14. visual impairment: blind and partially sighted |
|
|
Term
|
Definition
1. significantly below average general intellectual and adaptive functioning manifested during the developmental period, with significant delays in academic skills. Developmental period refers to birth to eighteen
2. general education interventions and activities must be provided prior to referral for evaluation
3. Evaluation should consist of - standard IQ test -standardized assessment of adaptive behavior - standardized test of academic or pre-academic achievement -social developmental history |
|
|
Term
|
Definition
1. the measured level of intellectual functioning is more than 2 SDs below the mean (see the sketch after this card)
2. the level of adaptive functioning is more than 2 SDs below the mean on the adaptive behavior composite or on 2 of 3 domains- must include parent/guardian input
3. the level of academic performance is consistent with the performance expected of a student of comparable intellectual functioning
4. the social developmental history IDs factors impacting student functioning and documents the student's functional skills outside school
5. the student needs special education |
|
|
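Note: a minimal Python sketch of the "more than 2 SDs below the mean" criterion; the mean of 100 and SD of 15 are the usual deviation-score convention, assumed here for illustration.

def below_cutoff(standard_score, mean=100.0, sd=15.0, n_sds=2.0):
    """True if the score falls more than n_sds standard deviations below the mean."""
    return standard_score < mean - n_sds * sd

print(below_cutoff(68))  # True  (68 < 100 - 2*15 = 70)
print(below_cutoff(74))  # False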
Term
InD documentation of eligibility determination |
|
Definition
1. written summary of the group's analysis of the data -the basis for making the determination
-noted behavior during the observation of the student and the relationship of that behavior to the student's academic and intellectual functioning
-educationally relevant medical findings
-the determination of the group concerning the effects on the student's achievement level
-signature of each group member certifying that the documentation reflects the member's conclusions |
|
|
Term
Specific Learning Disability |
|
Definition
-Reauthorization of IDEA in 2004
-states must not require the use of a discrepancy between intellectual ability and achievement for determining whether a child has an SLD
-a state can prohibit or make optional the use of a discrepancy model
-a state may permit the use of other alternative research-based procedures
-schools cannot use any single measurement or assessment as the sole criterion for determining eligibility |
|
|
Term
|
Definition
1. SLD- a disorder in one or more of the basic learning processes involved in understanding or in using language, spoken or written, that may manifest in significant difficulties affecting the ability to listen, speak, read, write, spell, or do mathematics
-associated conditions may include dyslexia, dyscalculia, dysgraphia, and developmental aphasia
-a specific learning disability does not include learning problems that are primarily the result of a visual, hearing, motor, intellectual, or emotional/behavioral disability, limited English proficiency, or environmental, cultural, or economic factors. |
|
|
Term
SLD general education intervention procedures and activities |
|
Definition
1. in order to ensure that lack of academic progress is not due to a lack of appropriate instruction, a group of qualified personnel must consider
-data that demonstrate that the student was provided effective instruction and interventions addressing the areas of concern
-data-based documentation, provided to the parents, of repeated measures of achievement at reasonable intervals, graphically reflecting the student's response to intervention during instruction
-general education activities and interventions conducted prior to referral |
|
|
Term
SLD evaluation requirements |
|
Definition
1. the school district must promptly request consent to conduct an evaluation to determine if the student needs specially designed instruction in the following circumstances
-the student doesn't make adequate progress ---prior to a referral, the student has not made adequate progress after an appropriate amount of time when provided appropriate instruction and intense individualized interventions, or ----prior to referral, intensive interventions are demonstrated to be effective but require sustained and substantial effort that may include the provision of specially designed instruction and related services
2. whenever a referral is made to conduct an evaluation to determine the student's need for specially designed instruction and the existence of a disability |
|
|
Term
RTI procedures for special education eligibility |
|
Definition
|
|
Term
the dual discrepancy as the key to eligibility determination |
|
Definition
1. Levels of difference 2. Rate Difference |
|
|
Term
|
Definition
large performance differences compared to peers and benchmark expectations in relevant domains of behavior |
|
|
Term
|
Definition
large differences in rate of learning compared to peers and trajectories toward benchmark standards when provided high-quality intervention over a significant period of time |
|
|
Term
Assessment Vs. Evaluation |
|
Definition
1. Assessment- the process of collecting information -need for standardization, reliability, validity
2. evaluation- the process of using information to make decisions -focusing on evaluation helps us consider why we are conducting the assessment |
|
|
Term
Process of determining eligibility |
|
Definition
1. problem solving process -ID the problem -----parental involvement -----systematic data collection
2. problem analysis 3. intervention development and implementation 4. evaluate the plan |
|
|