Supporting Students with
Additional Needs in an RTI System
Jon Potter, Ph.D.
Lisa Bates, Ph.D.
David Putnam, Ph.D.
Oregon RTI Project
OSPA Conference, Fall 2012
Afternoon Targets
Tier 2/3: Using data to place students
in interventions (literacy) & evaluating
intervention effectiveness
Tier 3: Individual Problem Solving
What is your role in ensuring the right students
receive the right support at the right time?
School Psychologists’ Role
“RTI calls for early identification of learning and
behavioral needs, close collaboration among
classroom teachers and special education
personnel and parents, and a systemic
commitment to locating and employing the
necessary resources to ensure that students
make progress in the general education
curriculum.” - NASP School Psych Role and RTI Fact Sheet
Assessment
Consultation
Program Evaluation
Using screening data to match
interventions to student need
(Literacy)
Which students receive
interventions?
• Schoolwide/Districtwide decision rules
should determine which students will
receive additional support
– Based on schoolwide screening data (DIBELS,
easyCBM, AIMSWEB, etc)
– Based on available resources and system
capacity
• Lowest 20%? 30%?
• All students well below benchmark?
Assessment

Decision rules guide placement in interventions
[Example: the same 60 2nd grade students selected under different cut points]
• easyCBM: lowest 25%; lowest 20%; or all high risk
• DIBELS Next: lowest 20%; lowest 25%; or all below and well below benchmark
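A hypothetical sketch of how a "lowest 20%" placement rule could be applied to screening scores; the function, cut fraction, and roster below are invented for illustration, not part of any district's actual rules:

```python
# Hypothetical sketch of a schoolwide placement rule: select the lowest 20%
# of students by screening score. The cut fraction and roster are invented
# for illustration; real cut points come from district decision rules.

def students_needing_support(scores, fraction=0.20):
    """Return names of the lowest `fraction` of students by score."""
    ranked = sorted(scores.items(), key=lambda item: item[1])
    n = max(1, round(len(ranked) * fraction))
    return [name for name, _ in ranked[:n]]

fall_orf = {"Ana": 41, "Ben": 18, "Cy": 62, "Dee": 25, "Eli": 55}  # WCPM
print(students_needing_support(fall_orf))  # ['Ben']
```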
Linking Assessment to
Intervention
Screening
Data
Intervention
Program
Instructional
need
Some will need more
Oral Reading
Fluency &
Accuracy
Phonemic
Awareness
Reading
Comp
Vocabulary
Phonics
(Alphabetic
Principle)
Logistics
• When do these types of discussions typically take place?
– Initial intervention placement meetings after
schoolwide screenings – 3x year
– May also discuss every 6-8 weeks when
reviewing student progress.
Consultation
Ensuring an Instructional Match
Question 1: What is the skill deficit?
Question 2: How big is that deficit?
Question 3: What interventions address that
deficit?
Question 4: How do we implement the program?
Question 1: What is the skill deficit?
The Big 5 of Reading
• Phonemic Awareness
• Phonics (Alphabetic Principle)
• Oral Reading Fluency & Accuracy
• Vocabulary
• Reading Comprehension
Assessment
Common Screening Data Sources

Phonemic Awareness:
• DIBELS Next: PSF, FSF
• easyCBM: Phoneme Segmenting
• AIMSWEB: Phoneme Segmentation

Phonics (Alphabetic Principle):
• DIBELS Next: NWF CLS, NWF WWR
• easyCBM: Letter Sounds
• AIMSWEB: NWF, LSF

Oral Reading Fluency & Accuracy:
• DIBELS Next: ORF CWPM, ORF Acc %
• easyCBM: PRF, PRF Acc %, WRF
• AIMSWEB: R-CBM, R-CBM Acc %

Reading Comprehension:
• DIBELS Next: Daze, RTF
• easyCBM: MC Reading Comp
• AIMSWEB: Maze, Reading CBM

*easyCBM includes a Vocabulary measure

CBM measures are linked to the Big 5 of Reading
Assessment
[DIBELS Next Class List Report (2nd Grade – Fall), annotated with the Big 5 area each column measures]
[easyCBM Class List Report (2nd Grade – Fall), annotated with the Big 5 area each column measures]
The Big 5 of Reading: How skills build on each other
• Phonemic Awareness
• Phonics (Alphabetic Principle)
• Oral Reading Fluency & Accuracy
• Vocabulary
• Reading Comprehension
• Activity:
– Oral Reading Fluency Assessment
• Find a partner
– Partner 1 (person with next Birthday) – Reader
– Partner 2 – Test Administrator
• Administer the reading assessment, and
have the reader answer the questions
Phonics and accuracy are important

Words missed per page when accuracy is… | 95% | 98% | 99%
My Brother Sam is Dead (5th–6th grade) | 15 | 6 | 3
The Secret Life of Bees (7th grade) | 18.5 | 7.4 | 3.6
The Magic School Bus (2nd–3rd grade) | 2.4 | 1.2 | 0.6
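The arithmetic behind the table: words missed per page is page length times the error rate (1 − accuracy). A minimal sketch, assuming roughly 300 words per page for a chapter-book page:

```python
# Words missed per page = words per page x (1 - accuracy).
# The 300 words-per-page figure is an assumption for illustration.

def words_missed_per_page(words_per_page, accuracy):
    return words_per_page * (1 - accuracy)

for acc in (0.95, 0.98, 0.99):
    print(acc, round(words_missed_per_page(300, acc), 1))
# 0.95 -> 15.0, 0.98 -> 6.0, 0.99 -> 3.0 (the My Brother Sam is Dead row)
```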
Accuracy is more important than fluency
Accurate at Skill → Fluent at Skill → Able to Apply Skill
• If not accurate, teach the skill. If yes, move to fluency.
• If not fluent, teach fluency/automaticity. If yes, move to application.
• If unable to apply, teach application. If yes, move to a higher-level skill/concept.
Adapted from
The Big 5 of Reading
Each of the Big 5 skills moves through accuracy, then fluency, then application:
• Phonemic Awareness
• Phonics (Alphabetic Principle)
• Oral Reading Fluency & Accuracy
• Vocabulary
• Reading Comprehension
Phonics Example:
Nonsense Word Fluency
Accurate at Skill: student knows all letter sounds and makes few, if any, mistakes.
Fluent at Skill: student knows all letter sounds AND provides letter sounds fluently.
Able to Apply Skill: student automatically blends letter sounds into whole words.

[Three sample NWF probes illustrate the stages:
• Accuracy problem: 35/56 letter sounds correct = 63% (CLS 35, WWR 0)
• Accurate but not fluent: 35/36 letter sounds correct = 97% (CLS 35, WWR 0)
• Accurate, fluent, and applying: 54/54 letter sounds correct = 100% (CLS 68, WWR 24)]
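To make the heuristic concrete, here is an illustrative sketch in Python; the 95% accuracy cut and the CLS benchmark of 54 are assumptions for illustration, not official DIBELS decision rules:

```python
# Illustrative sketch of the accuracy -> fluency -> application heuristic
# applied to NWF scores. The 95% accuracy cut and the CLS benchmark are
# assumptions for illustration, not official DIBELS decision rules.

def nwf_focus(cls_correct, cls_attempted, wwr, cls_benchmark=54):
    accuracy = cls_correct / cls_attempted
    if accuracy < 0.95:
        return f"teach the skill (accuracy = {accuracy:.1%})"
    if cls_correct < cls_benchmark:
        return f"build fluency/automaticity (accurate at {accuracy:.1%} but slow)"
    if wwr == 0:
        return "teach application (not yet blending whole words)"
    return "move to a higher-level skill (accurate, fluent, and applying)"

print(nwf_focus(35, 56, 0))   # 35/56 ~ 63%: teach the skill
print(nwf_focus(35, 36, 0))   # 97% accurate but slow: build fluency
print(nwf_focus(68, 68, 24))  # accurate, fluent, applying: move on
```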
Validating the deficit
• CBM measures (DIBELS, easyCBM, AIMSWEB, etc.) are “indicators”
• What does your other data tell you?
– In-curriculum assessments
– Other CBM data
– OAKS
Assessment
Question 2: How big is that deficit?
Is the skill low or significantly low?
• You must define what is low and what is
significantly low:
Examples:

• DIBELS Next, compared to a research-based standard: Low = below benchmark; Significantly low = well below benchmark
• easyCBM*, compared to other students: Low = between the 11th and 20th percentile; Significantly low = ≤10th percentile
• AIMSWEB**, compared to other students or a standard you set: Low = between the 11th and 25th percentile; Significantly low = ≤10th percentile
*easyCBM default percentile rank settings
**AIMSWEB default percentile rank settings
Question 3: What interventions
address that deficit?
Program Evaluation
What intervention programs does your
school have that address the skill need(s)?
[Matrix, shown three times in the original deck, marking each intervention program against the skill areas it addresses:
Programs: Triumphs, Phonics for Reading, STARS, Reading Mastery, Language for Thinking, Horizons, Read Naturally, SFA Tutoring
Skill areas: Phonemic Awareness, Phonics, Oral Reading Accuracy & Fluency, Vocabulary, Reading Comprehension]
Additional resources for
evaluating interventions
• What Works Clearinghouse
– http://ies.ed.gov/ncee/wwc/
• Florida Center for Reading Research
– http://stage.fcrr.org/fcrrreports/CReportsCS.aspx?rep=supp
• Oregon Reading First
– http://oregonreadingfirst.uoregon.edu/inst_curr_review_si.html
• Best Evidence Encyclopedia
– http://www.bestevidence.org/
Question 4: How do we implement
the program?
Consultation
Placement Tests
Once an intervention program that addresses the instructional need is identified, placement tests should be used to form instructional groups of students.
Other considerations
• Available resources (time, staff, materials)
will guide how many groups are created.
• Consider the behavioral and
social/emotional needs of the students
Additional Diagnostic data
• Diagnostic assessment in critical area of
need:
• Quick Phonics Screener
• Curriculum-Based Evaluation
• CORE Multiple Measures
• DIBELS booklets error patterns
• Running Records
• Other?
With your partner
• What other data sources do you currently use, or are available to you, to help match interventions to student need?
– Reading
– Math
– Writing
– Behavior
Documentation
[Sample documentation grid for “Johnny”: deficit in phonics (in text) marked X, validated with Oral Reading Fluency and a Quick Phonics Screener (O O), and placement in Reading Mastery 2]
Evaluating Interventions
What’s the Big Idea(s)!?
• Use appropriate progress monitoring tools
• Set Goals
• Establish Decision Rules
• Analyze data, apply decision rules and
determine what to change
Progress Monitoring Tools
• Brief & easy
• Sensitive to growth
• Frequent
• Equivalent forms!!!
What are some commonly used progress
monitoring tools?
Reading
AIMSWEB
Reading CBM, Maze
DIBELS NEXT
FSF, PSF, NWF, ORF, Daze
easyCBM
PSF, LSF, WRF, PRF, MC Reading Comp, Vocab
Math
AIMSWEB
M – Computation, M – Concepts & Applications, CBM –
Early Numeracy
easyCBM
Numbers & Operations, Measurement, Geometry, Algebra
Written Language
Writing – CBM (Total Words Written, Correct Writing Sequences, Words
Spelled Correctly)
What are NOT good progress monitoring
tools?
Reading
•Phonic Screeners
•Report Cards
•OAKS
•DRA
•Running Records
•Reading curriculum
weekly or monthly tests
or fluency passages
Math
Curriculum weekly tests
OAKS
Teacher created math probes*
Written Language
Writing rubrics*
OAKS
* when not administered and scored in a standardized and reliable way, or checked for
consistency of multiple probes
Do we have the right “indicators”?
• Oral Reading Fluency and Accuracy in
reading connected text is one of the best
indicators of overall reading comprehension
(Fuchs, Fuchs, Hosp, & Jenkins, 2001)
Fluent & accurate reading is not the end goal…
but a child who cannot read fluently and accurately
cannot fully comprehend written text.
Additional Progress Monitoring Tools
For more info and a review of available
tools, visit www.rti4success.org
(Progress Monitoring Tools Chart)
Goal Setting:
Things to Consider
1. What is the goal?
– Criterion-based
• Research-based benchmarks/proficiency
– Norm-based
• Minimum of 25th percentile (bottom limit of average)
• School, District, State, or National norms
How do you define success?
Goal Setting:
Things to Consider
2. By when will they get there?
– Long term goals always at proficiency
(i.e., grade placement benchmark)
– Short term goals may be an incremental step
towards proficiency (i.e., instructional level
material)
Does your goal close the gap?
Goal Setting:
Things to Consider
3. What does reasonable growth look
like?
– National Growth rates (Fuchs, AIMSWEB,
Hasbrouck & Tindal)
– Local Growth rates
• District, School, Classroom, Intervention Group
What progress can we expect?
National Growth Rates: Reading

Grade | Average ORF Growth (WCPM)* | Ambitious ORF Growth (WCPM)* | Average Maze Growth (WCR)**
1 | 2.0 | 3.0 | 0.4
2 | 1.5 | 2.0 | 0.4
3 | 1.0 | 1.5 | 0.4
4 | 0.85 | 1.1 | 0.4
5 | 0.5 | 0.8 | 0.4
6 | 0.3 | 0.65 | 0.4

*Fuchs et al. (1993), **Fuchs & Fuchs (2004)
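One way to use these rates: end-of-year goal = baseline + weeks × expected weekly growth. A minimal sketch; the 32 WCPM baseline and 18 remaining weeks are invented for illustration:

```python
# Sketch: goal = baseline + weeks x expected weekly growth.
# Growth rates from the table above (Fuchs et al., 1993); the student's
# baseline and weeks remaining are invented for illustration.

AVERAGE_GROWTH = {1: 2.0, 2: 1.5, 3: 1.0, 4: 0.85, 5: 0.5, 6: 0.3}    # WCPM/week
AMBITIOUS_GROWTH = {1: 3.0, 2: 2.0, 3: 1.5, 4: 1.1, 5: 0.8, 6: 0.65}  # WCPM/week

def orf_goal(baseline_wcpm, grade, weeks, rates):
    return baseline_wcpm + weeks * rates[grade]

# 2nd grader reading 32 WCPM with 18 instructional weeks left:
print(orf_goal(32, 2, 18, AVERAGE_GROWTH))    # 59.0
print(orf_goal(32, 2, 18, AMBITIOUS_GROWTH))  # 68.0
```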
“Using national normative samples allows
comparisons to be made with the
performance levels expected of typical
performing students from across the
country and equates more closely with data
sets that are used in well developed,
published, norm-referenced tests.”
Shapiro, 2008
Local Growth Rates
What does typical growth look like in…
…your district?
…your school?
…your classroom?
…your intervention group?
“…use of the combination of local and
national norms provides the user of these
data with opportunities to evaluate how
student performance compares with a
national sample of same-grade peers, as
well as against the local peers within the
particular school.”
Shapiro, 2008
Setting Appropriate Goals Is
Important
[Graph: a student's Oral Reading Fluency (words correct per minute) plotted against the benchmark, contrasting an 18 WCPM goal with a 36 WCPM goal]
Decision Rules
• Decision rules guide how we decide if our interventions are working, and when to move on
• Decision rules create consistency across grade levels and schools
• They determine how to intensify and individualize interventions
• They standardize the process for eligibility decision making
Key features of decision rules
• Set the grade levels for the decision rules (K, 1–6)
• Number of points below the aimline
• Give direction when the data are highly variable
– Trendline analysis
• Duration of intervention / frequency of monitoring (length of time between meetings, typically 6 to 8 weeks)
• Define success
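As a hypothetical instantiation of such a rule (the "four consecutive points below the aimline" convention is a common example, not taken from these slides):

```python
# Hypothetical decision rule: flag a needed change when a run of consecutive
# scores falls below the aimline. The "4 consecutive points" convention is
# a common example used for illustration, not a rule from these slides.

def needs_change(scores, aimline, run_length=4):
    """scores and aimline are parallel weekly WCPM values."""
    run = 0
    for score, expected in zip(scores, aimline):
        run = run + 1 if score < expected else 0
        if run >= run_length:
            return True
    return False

aim = [20, 22, 24, 26, 28, 30]
print(needs_change([19, 21, 22, 24, 25, 27], aim))  # True: all six below
print(needs_change([21, 20, 25, 27, 29, 31], aim))  # False: run is broken
```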
Evaluating Interventions:
Is What We Are Doing Working?
AAA
• Apply Decision Rules: Is the student
making adequate progress based on
decision rules?
• Analyze: Is it an individual or a group
problem?
• Action: Determine what to change
Apply: Is the Student Making
Adequate Progress?
[Progress monitoring graph: Chase's December–June scores plotted against his aimline]
Analyze: Is it an Individual or a
Group Problem?
Cohort Group Analysis:
Students who have similar literacy
programming:
– Grade level
– Intervention program
– Time
– ELD level
Cohort Data
[Graph: December–June scores for Isaiah, Mary, and Amy at or above the aimline while Chase stays far below it, suggesting an individual problem]
Cohort Data
[Graph: Amy, Isaiah, Chase, and Mary all below the aimline across December–June, suggesting a group problem]
Action:
Determine What to Change
• Listen to the data
• Gather additional data if necessary
• Focus on instructional variables that you
can control!
Focus on what we can control
What do we change?
• Time
• Group size
• Time/engagement
• Different program
• Individual problem solving
A Final Thought
It’s better to shoot for the stars and
miss than aim at the gutter and hit it.
– Anonymous
Break Time
Individual Problem Solving
OSPA Fall Conference
Oregon RTI Project
October 12th, 2012
Targets
• Provide a framework for individual problem solving with students who have the most intensive needs
“It is better to know some of
the questions than all of the
answers.”
James Thurber
Problem-Solving Non-example
Who are students with the
most intensive needs?
Students with identified disabilities
Students who may have a disability
Students with significant literacy deficits
If there was a problem…
Why proactive problem solving?
“Problem solving assessment typically takes a more direct approach to the measurement of need than has been the case in historical special education practice” – Reschly, Tilly, & Grimes (1999)

“Intervention studies that address the bottom 10–25% of the student population may reduce the number of at-risk students to rates that approximate 2–6%” – Fletcher, Lyon, Fuchs, & Barnes (2007)
The Problem Solving Process

1. Problem Identification: What is the problem?
2. Problem Analysis: Why is the problem occurring?
3. Plan Development: What are we going to do about the problem?
4. Plan Implementation & Evaluation: How is it working?

The cycle centers on Improved Student Achievement.
Problem Solving Form
Step 1: Problem Identification
What is the problem?
Step 1: Problem Identification
A problem is defined as a discrepancy between:
• Expected performance
• Current performance
The gap between the two is the problem definition.
Step 1: Problem Identification
• Expected performance is based on data:
– Performance of typical/average peers
– Research-based benchmarks
– Proficiency scores
• Actual performance is based on current
student data
Step 1: Problem Identification
• Calculating magnitude of discrepancy

Absolute discrepancy: Expected performance – Current performance
72 wcpm (Winter 2nd Grade) – 32 wcpm = 40 wcpm below expectation

Discrepancy ratio: Larger number ÷ Smaller number
72 wcpm ÷ 32 wcpm = 2.25 times discrepant
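The same two calculations as a tiny sketch:

```python
# The slide's two discrepancy calculations as a sketch.

def discrepancy(expected, current):
    absolute = current - expected                         # negative = deficit
    ratio = max(expected, current) / min(expected, current)
    return absolute, ratio

absolute, ratio = discrepancy(expected=72, current=32)
print(absolute)  # -40 (wcpm below the winter 2nd grade expectation)
print(ratio)     # 2.25 (times discrepant)
```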
Discrepancy between Current
Performance & Expected Performance
Step 1: Problem Identification
Problem Definitions should be:
1. Objective – observable and measurable
2. Clear – passes “the stranger test”
3. Complete – includes examples (and non-examples when necessary) and baseline data
Problem Definition: Example
Harry (2nd grader) is currently reading a
median of 44 words correct per minute
(wcpm) with 83% accuracy when given
2nd grade level text. He also answers an
average of 3/10 comp questions correct
on weekly in-class tests.
2nd grade students in his school are
reading an average of 85 wcpm with 97%
accuracy on 2nd grade text and
answering 9/10 comp questions correct.
Problem Definition: Non-Example
Harry struggles with being a fluent
reader and is not meeting the 2nd grade
reading benchmark. He makes a lot of
mistakes and is currently reading at a 1st
grade level. He also has difficulties
answering comprehension questions at
grade level and does poorly on his
weekly reading tests.
Step 1: Problem Identification
• Replacement Skill or Target Behavior
– What would it look like if this student were
successful?
– What would we prefer the student do,
instead of the problem behavior?
Problem Definition & Target Skill
The Problem Solving Process

With the problem identified, the cycle moves to 2. Problem Analysis: Why is the problem occurring?
Step 2: Problem Analysis
Problem Analysis sits between Problem Identification and Plan Development.
The WHY should always drive the WHAT.
The Water…
Focus on “the water”:
• Instruction
• Curriculum
• Environment
ICEL
I – Instruction: How you teach
C – Curriculum: What you teach
E – Environment: Where you teach
L – Learner: Who you teach
All four surround Student Learning.
We can control the how, what, and where.
We don’t have much control over the who.
When it comes to problem
analysis, just remember…
ICE, ICE baby
I – Instruction
C – Curriculum
E – Environment
then
L – Learner
What impacts student achievement?

Effective teaching variables (effect size):
• Formative Evaluation (+0.90)
• Comprehensive interventions for students with LD (+0.77)
• Teacher Clarity (+0.75)
• Reciprocal Teaching (+0.74)
• Feedback (+0.73)
• Teacher-Student Relationships (+0.72)
• Direct Instruction (+0.59)

Other variables (effect size):
• Socioeconomic Status (+0.57)
• Parental Involvement (+0.51)
• Computer-based instruction* (+0.37)
• School Finances (+0.23)
• Aptitude by Treatment Interactions* (+0.19)
• Family Structure (+0.17)
• Retention (-0.16)

John Hattie, Visible Learning, 2009
Hypothesis Development
Instruction: ?
Curriculum: ?
Environment: ?
Learner: ?
ICEL
Assessment
Instruction, Curriculum, & Environment
• What should appropriate instruction,
curriculum, and environment look like?
• Video: Early Reading Intervention
– 3 students receiving direct instruction on
phonemic awareness & phonics
– Observe and note effective teaching practices
with regard to instruction, curriculum, and
environment
Instruction, Curriculum, Environment
Talk time
• What effective teaching practices did you see
related to instruction, curriculum, &
environment?
• What questions/concerns/suggestions might
you have for this teacher?
Assessment ≠ Testing ≠ Evaluation
*Testing – “administering a particular set of
questions to an individual to obtain a score”
*Assessment – “the process of collecting data for
the purpose of making decisions about students”
**Evaluation – “procedures used to determine
whether the child has a disability, and the nature
and extent of the special education and related
services that the child needs.”
*Salvia & Ysseldyke, 2004
**Oregon Administrative Rules, 581-015-2000
Assessment
Assessment: RIOT
R – Review
I – Interview
O – Observe
T – Test
Hypothesis Development
Instruction:
Curriculum:
Environment:
Learner:
Instruction
• Thinking about RIOT procedures, what are
some ways we can gather information
about Instruction?
R – Review: Examine lesson plans, attendance, permanent products for instructional demands
I – Interview: Talk to teachers about expectations, instructional strategies used
O – Observe: Observe instruction in the classroom for effective instructional practices
T – Test: Aggregate test scores of the classroom
Instruction: Examples
Targets for intervention → effective practice:
• “Who knows…?” → I do, we do, y’all do, you do
• 1–2 OTRs/min → 8–12 OTRs/min
• <50% of errors corrected → 95–100% of errors corrected
Is this effective instruction?
When it comes to interventions…
“It is clear that the program is less important
than how it is delivered, with the most
impressive gains associated with more
intensity and an explicit, systematic
delivery”
Fletcher & Colleagues, 2007
Instruction Resources
Explicit Instruction – Archer & Hughes (2011)
– www.explicitinstruction.org
Teaching Reading Sourcebook - CORE
– http://www.corelearn.com/
Classroom Instruction that Works: Research-Based Strategies for Increasing Student Achievement – Marzano et al. (2001)
Curriculum
• Thinking about RIOT procedures, what are
some ways we can gather information
about Curriculum?
R – Review: Examine permanent products for skills taught, scope & sequence, instructional match
I – Interview: Talk to teachers, administrators about philosophy of curriculum, coverage, etc.
O – Observe: Student success rate
T – Test: Readability of textbooks
Curriculum: Examples
Targets for intervention → matched instruction:
• Not matched to need → Matched to need
• Frustrational (<80%) → Instructional (80–90%)
• Weak (<80%) → Strong (>80%)
Reading Skills Build on Each Other
• Phonemic Awareness
• Phonics (Alphabetic Principle)
• Oral Reading Accuracy & Fluency
• Vocabulary
• Reading Comprehension
Environment
• Thinking about RIOT procedures, what are
some ways we can gather information
about Environment?
R – Review: Examine school rules, attendance, class size
I – Interview: Talk to teachers about expectations, rules, behavior management system, classroom culture; talk to parents
O – Observe: Observe in the classroom
T – Test: Aggregate test scores of the classroom
Environment: Examples
Targets for intervention → effective environment:
• Expectations not defined → Explicitly taught & reinforced
• Low rate of reinforcement → Mostly positive (4:1)
• Chaotic & distracting → Organized & distraction-free
Academic Learning Time: Typical School

School Year (6.5 hours × 180 days): 1170 hours
– Absenteeism (1 day/month × 10 months): 65
= Attendance Time (time in school): 1105
– Non-instructional time (1.5 hrs/day for recess, lunch, etc.): 270
= Allocated Time (time scheduled for teaching): 835
– Administration, transition, discipline (25% of allocated time; 15 minutes/hour): 209
= Instructional Time (time actually teaching): 626
– Time off task (engaged 75% of time): 157
= Engaged Time (on task): 469
– Unsuccessful engaged time (success rate 80%): 94
= Academic Learning Time: 375 hours

Efficiency rating = 375/1170 ≈ 32%
Education Resources Inc., 2005
Academic Learning Time: Effective School

School Year (6.5 hours × 180 days): 1170 hours
– Absenteeism (1 day/month × 10 months): 65
= Attendance Time (time in school): 1105
– Non-instructional time (1.5 hrs/day for recess, lunch, etc.): 270
= Allocated Time (time scheduled for teaching): 835
– Administration, transition, discipline (15% of allocated time; 9 minutes/hour): 125
= Instructional Time (actually teaching: 710 vs. 626): 710
– Time off task (engaged 90% of time): 71
= Engaged Time (639 vs. 469 on task): 639
– Unsuccessful engaged time (success rate 90%): 64
= Academic Learning Time: 575 hours

Efficiency rating = 575/1170 ≈ 49%
Education Resources Inc., 2005
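A sketch reproducing the arithmetic of both columns; the slides round at each step (giving 375 and 575), so the unrounded chain below differs by less than an hour:

```python
# Academic Learning Time chain from the two slides above: hours remaining
# after each successive loss. The slides round at each step, so they show
# 375 and 575; this unrounded chain lands within a fraction of an hour.

def academic_learning_time(noninstructional_frac, engaged_frac, success_frac):
    hours = 6.5 * 180                    # school year: 1170 hours
    hours -= 65                          # absenteeism (1 day/month x 10 months)
    hours -= 270                         # recess, lunch, etc. (1.5 hrs/day)
    hours *= 1 - noninstructional_frac   # administration, transitions, discipline
    hours *= engaged_frac                # time students are actually on task
    hours *= success_frac                # engaged time that is successful
    return hours

typical = academic_learning_time(0.25, 0.75, 0.80)
effective = academic_learning_time(0.15, 0.90, 0.90)
print(round(typical), round(effective))  # 376 575
print(round(effective - typical))        # ~199 hours gained
```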
The Difference: Typical vs. Effective Schools

• Allocated non-instructional time: 25% (15 min/hr) typical vs. 15% (9 min/hr) effective = +84 more hours. How the time is gained: teaching expectations, teaching transitions, managing appropriate and inappropriate behavior efficiently.
• Engagement rate: 75% typical vs. 90% effective = +86 more hours. How the time is gained: better management of groups, pacing.
• Success rate: 80% typical vs. 90% effective = +30 more hours. How the time is gained: appropriate placement, effective teaching.
• Academic learning time: 375 hours typical vs. 575 hours effective
= 200 more hours (53% more) OR 95 more school days (4–5 months!)
Learner
• Thinking about RIOT procedures, what are
some ways we can gather information
about Learner?
R – Review: Examine cumulative file, health records, developmental history, etc.
I – Interview: Talk to teachers, parents, and the student about perceptions of the problem
O – Observe: Observe the student in the classroom
T – Test: Direct assessment
Learner: Examples
• Poor attendance → Great attendance
• Well below benchmarks → At benchmarks
• Off-task, disruptive, disengaged → Focused & attentive
Before considering additional testing
• Start with existing data:
– Screening data
– Progress monitoring data
– State testing data (OAKS)
– In curriculum data
• Is additional data needed?
– What additional questions do you have?
– Which diagnostic assessments can
answer those questions?
Assessment
Additional Resources
• Curriculum-Based
Evaluation: Teaching &
Decision Making
– Howell & Nolet
• CORE Assessing Reading
Multiple Measures
• Quick Phonics Screener
• DIBELS Deep
Hypothesis Development
Instruction:
Curriculum:
Environment:
Learner:
Hypothesis Development
• What can we do that will reduce the
problem (decrease the gap between what
is expected and what is occurring)?
Expected performance
Current performance
Problem Hypothesis
• Why is the problem occurring?
• Example:
– Harry’s reading fluency and comprehension
problems occur because he lacks strategies for
decoding silent-e words and vowel digraphs
(oa, ea, ae, ou, etc.). His current instruction does not provide enough explicit modeling of these skills. He also currently has a low level of engagement and is highly distracted in both his classroom and intervention room.
Prediction Statement
• What will make the problem better?
• Example:
– Harry will improve if he receives explicit
instruction in his identified missing skills. He
also needs instruction that utilizes high pacing
and effective active engagement strategies to
keep him highly engaged in instruction, and
an environment that is quiet, without
distraction from other students.
Problem Hypothesis & Prediction
Step 3: Plan Development
What are we going to do about the problem?

Consultation
Intervention Plan
Progress Monitoring Plan
Fidelity Monitoring Plan
Fidelity Checklist
Importance of Feedback
• Wickstrom et al. studied 33 intervention cases.
• Teachers agreed to implement an intervention and were then observed in class.
• 0 of 33 teachers had fidelity above 10%.
• 33 of 33 indicated on a self-report measure that they had used the intervention as specified by the team.
Slide taken from a presentation by Joseph Witt
Consultation
Importance of Feedback
“Among the most powerful of interventions is
feedback or formative evaluation – providing
information to the teacher as to where he or
she is going, how he or she is going there, and
where he or she needs to go next” Hattie, 2012
(Visible Learning for Teachers)
“Feedback is the breakfast of champions”
Kevin Feldman
Consultation
Step 4: Plan Implementation &
Evaluation
How is it working?
Attendance
Fidelity Data
Progress Monitoring Data…
…as compared to peers/expected growth
Cohort Data
[Graph: Isaiah, Mary, and Amy at or above the aimline while Chase stays far below it]
Cohort Data
[Graph: Amy, Isaiah, Chase, and Mary all below the aimline]
Magnitude of Discrepancy

Next Steps:
Based on Data & District Policies & Procedures
Final Thought: Data, Data, Data

Questions/Comments
Jon Potter [email protected]
Lisa Bates [email protected]
David Putnam [email protected]

www2.oregonrti.org