Advanced Data Conversations
Laura Otten
Anna Harms
Melissa Nantais
Jennifer Rollenhagen
MiBLSi State Conference, 2011
Agenda
• Types of Data and Purposes of Assessment
• Considerations around Integrating Data
Systems
• The Continuous School Improvement Process
and How MiBLSi Data Fits In
• Effectively Integrating Data into your School
Improvement Plan
Acknowledgements
• MiBLSi Staff and State Trainers
• MDE Office of School Improvement
• Anna Harms – Evaluation Coordinator, MiBLSi
• Melissa Nantais – Training Coordinator, MiBLSi
• Laura Otten – School Improvement Consultant, Kent ISD
• Jennifer Rollenhagen – TAP, Behavior Data Systems Coordinator, MiBLSi
Materials
• Action Plan
• School Data Profile/Analysis Document
Agenda
• Types of Data and Purposes of Assessment
• Considerations around Integrating Data
Systems
• The Continuous School Improvement Process
and How MiBLSi Data Fits In
• Effectively Integrating Data into your School
Improvement Plan
Types of Data
Demographic or Contextual Data
Describes our students, staff,
building, and community
Achievement/Student Outcome Data
How our students perform on local,
state, and federal assessments
(subgroups)
Process Data
The policies, procedures, and systems
we have in place that define how we do
business
Perception Data
Opinions of staff, parents, community and
students regarding our school
Frequency of data collection:
• Summative district and state assessments – annually
• Data about people, practices, perceptions – 2-4 times a year
• Benchmark, common assessments – quarterly or end of the unit
• Formative common assessments – 1-4 times a month
• Formative classroom assessments for learning – daily/weekly
Examples of Data
• Demographic or Contextual Data: student subgroups, enrollment, attendance, parent involvement, teaching staff
• Process Data: policies and procedures, School Process Rubrics (40 or 90), SA or SAR (NCA)
• Achievement/Student Outcome Data: local assessments, state assessments, national assessments
• Perception Data: survey data, opinions
Student Outcome Measures
• Answer the questions:
– What are the skills of our students?
– How are different groups of students doing?
Having this information helps us to determine if
what we are doing is making a difference for
our students.
4 Types of Outcome Assessments
• Universal Screening (Formative) – Use: identify children who need more intense assessment to determine the potential for intervention. Purpose: “First Alert”
• Progress Monitoring (Formative) – Use: use information to determine student progress and to plan differentiated instruction. Purpose: “Growth Charts”
• Diagnostic (Formative) – Use: use information to plan instruction, including intensive intervention strategies. Purpose: “In-depth View”
• Outcome (Summative) – Use: evaluate student performance after instruction is completed. Purpose: “Reaching our goals”
Note: DIBELS, AIMSweb and SWIS are examples of tools that allow for universal screening and progress monitoring of student outcomes in reading and behavior.
MiBLSi Data Tools
Systems/Process/Fidelity
• Behavior: PBIS Self-Assessment Survey (PBIS-SAS), PBIS Team Implementation Checklist (PBIS-TIC), Schoolwide Evaluation Tool (SET), Benchmarks of Quality (BoQ), Benchmarks for Advanced Tiers (BAT)
• Reading: Planning and Evaluation Tool for Effective Schoolwide Reading Programs (PET) or Schoolwide Evaluation and Planning Tool for Middle School Literacy (SWEPT), MiBLSi Reading Support Implementation Checklist (RTIC)
Outcomes
• Behavior: discipline referrals, suspensions/expulsions
• Reading: response to Tier 2/3 interventions, Reading CBM data (screening and progress monitoring)
Examples of Data
• Demographic or Contextual Data: student subgroups, enrollment, attendance, parent involvement, teaching staff
• Process Data: policies and procedures, School Process Rubrics (40 or 90), SA or SAR (NCA)
• Achievement/Student Outcome Data: local assessments, state assessments, national assessments
• Perception Data: survey data, opinions
Systems/Process/Fidelity Measures
• Answer the questions:
– Are we doing what we have learned?
– How well are we doing it, when, where,
who?
Having this information helps us to accurately
interpret our student outcomes.
MiBLSi Data Tools
Systems/Process/Fidelity
• Behavior: PBIS Self-Assessment Survey (PBIS-SAS), PBIS Team Implementation Checklist (PBIS-TIC), Schoolwide Evaluation Tool (SET), Benchmarks of Quality (BoQ), Benchmarks for Advanced Tiers (BAT)
• Reading: Planning and Evaluation Tool for Effective Schoolwide Reading Programs (PET) or Schoolwide Evaluation and Planning Tool for Middle School Literacy (SWEPT), MiBLSi Reading Support Implementation Checklist (RTIC)
Outcomes
• Behavior: discipline referrals, suspensions/expulsions
• Reading: response to Tier 2/3 interventions, Reading CBM data (screening and progress monitoring)
The Regional Data Initiatives Grant
The Regional Data
Initiatives (RDI) project is
an $11.5 million effort
funded by the American
Recovery and
Reinvestment Act (ARRA).
RDI will provide Michigan
teachers with real-time
access to student data at
the classroom level to
inform instructional
decisions.
The RDI project encompasses 97.5% of Local
Education Agency districts and 45% of Public School
Academy districts within all 57 Intermediate School
Districts.
Ensuring Useful, Relevant Data
• Graphic data tools provide teams with the ability to get a
quick overview of trends
• Data collection should be easy (<1% of staff time) so that
we can spend the majority of our time acting upon the
data, not collecting it
• We must ensure the validity and reliability of our data (the
measures themselves and how they are collected)
• Data must be “triangulated” (look across multiple data
sources for trends and converging evidence)
• Data must be relevant, timely, efficient and practical
• Data should be useful for and used for making decisions
Agenda
• Types of Data and Purposes of Assessment
• Considerations around Integrating Data
Systems
• The Continuous School Improvement Process
and How MiBLSi Data Fits In
• Effectively Integrating Data into your School
Improvement Plan
Benefits of Integrating Data Systems
• One-stop source of data.
• Could reduce time required to find data.
• Could increase time available to spend on
analyzing data and using it for decision-making.
• Could increase staff ownership and use of
data.
• Opportunity to examine the relation
between multiple sources of data.
• Potential for reduced cost over time.
Considerations
• Have a deep understanding of the
purpose and function of the different data
systems you are working with.
• Build an integrated data system that
reflects the best features of each system.
• Avoid taking something functional and
making it less functional “just ‘cause” it
seems like a good idea.
Discussion
– What do you see as some of the
critical features of the DIBELS,
AIMSweb, SWIS and PBIS Surveys
data systems?
– Write notes to yourself for 2 minutes
– Discuss your ideas with the person
next to you for 3 minutes.
Did your list look at all like this?
• Relatively inexpensive
• Web-based
• Multiple levels of access
• Both data entry and reporting
• GRAPHS, GRAPHS, GRAPHS,
GRAPHS, GRAPHS, GRAPHS!
A specific example:
Integrating the Schoolwide
Information System (SWIS) data
with district data systems
Data Integration of SWIS
[Diagram: SWIS referral data flowing between SWIS and the district database]
Sharing Ideas and Discussion
What’s Working
• Double-Entry
– Reframe purpose of applications
• S-DEX
– PowerSchool
– Pentamation
– eSIS
[Diagram: SWIS referral data flowing between SWIS and the district database]
Sharing Ideas and Discussion
What’s Not Working
Transferring Referrals into SWIS
• ASIST
– Decrease in data quality
– Decrease in data integrity
– High level of on-going support at school, district,
facilitator and SWIS levels
• SIF
– Low adoption from major SIS vendors
– Expensive $$$
– No known examples of referral transfer
Sharing Ideas and Discussion
Where are We Headed?
• ECS Distributor
– Data warehouses
• SIF…
• Behavior and Literacy
[Diagram: SWIS referral data flowing to a data warehouse]
Discussion
– What conversations and experiences are
you having in your district around
integrating data systems?
– To what extent are you working with
separate vs. integrated data systems?
– What are your hopes for the future?
– What are your concerns?
Agenda
• Types of Data and Purposes of Assessment
• Considerations around Integrating Data
Systems
• The Continuous School Improvement Process
and How MiBLSi Data Fits In
• Effectively Integrating Data into your School
Improvement Plan
School Improvement is the Process that
supports the Development and Maintenance
of a Response to Intervention System
The One Common Voice – One Plan process, centered on student achievement and the RtI plan:
• Gather: Getting Ready; Collect School Data; Build School Profile
• Study: Analyze Data; Set Goals; Set Measurable Objectives; Research Best Practice
• Plan: Develop Action Plan
• Do: Implement Plan; Monitor Plan; Evaluate Plan
School Data Profile/Analysis
“The School Data Profile/Analysis (SDP/A)
is a tool to assist school staff in
determining the strengths and needs for
improvement of their school based on an
analysis of data and response to a series
of data related questions…The SDP/A is
intended to support deeper dialogue about
the data and information, and to draw
thoughtful conclusions about the areas of
need.”
(SDP/A, p.2)
Connecting MiBLSi Data to SDP/A
• Mobility & Attendance
– Grade Level Attendance and Discipline Data: Local Data Last Five Years
– Sub-Group Attendance and Discipline Data: Local Data Last Five Years
– Question 3
– Question 4
Connecting MiBLSi Data to SDP/A
• Grade Level Achievement
– School Level Grade Level Achievement for
All Students
– Question 2
– Question 4
– Question 5
– Question 6
– Question 8
Connecting MiBLSi Data to SDP/A
• Students with Disabilities
– Question 6
– Question 7
Connecting MiBLSi Data to SDP/A
• Limited English Proficient
– Question 4
Connecting MiBLSi Data to SDP/A
• Perception Data: Teacher/Staff
– Question 2
– Question 3
– Question 5
– Question 6
Agenda
• Types of Data and Purposes of Assessment
• Considerations around Integrating Data
Systems
• The Continuous School Improvement Process
and How MiBLSi Data Fits In
• Effectively Integrating Data into your School
Improvement Plan
Breaking Down Your Plan
[Diagram: Goal → Gap Statement and Cause for Gap → Objectives → Strategies → Activities. The School Data Profile and fidelity data fit at the gap and cause-for-gap level; student outcome data fits at the goal and objective level.]
Is there a problem?
• What is a problem?
• A problem is a gap between
the desired outcome or
objective and current status.
Precise Problem/Gap Statements
(What are the data we need for a decision?)
• Precise problem statements include
information about the Big Five questions:
– What is the problem and how often is it
happening?
– Where is it happening?
– Who is experiencing the problem?
– When is the problem most likely?
– Why do we see this problem and what is
sustaining it?
Problem Statement
We have high rates of physical aggression, disrespect and
inappropriate language on the playground
Costs:
Planning:
• Supervisor meeting: 12 people, 60 min
• 3-4 people, about 2 hours each, for planning details
• One hour of administrative time to schedule
• Total time = ~21 hours
Implementation:
• All supervisors spend a half day away from regular duties
• All students spend 30 minutes of classroom instruction time
• All teachers spend 30 classroom instructional minutes
• Total supervisor time = ~15 hours; instructional time = ~60 min per grade level
Horner & Todd, 2010
Problem Statement
We have high rates of physical aggression, disrespect and
inappropriate language on the playground during second and
third grade recess.
Benefits resulting from defining the problem
with more precision:
Narrowed focus from whole school to 2nd & 3rd grade teachers,
supervisors and students.
Total planning time = ~ 7 hours
Total implementation time = ~ 30 minutes
K, 1st, 4th, 5th grade teachers, supervisors and students maintained
their regular schedule recouping 30 min instructional time per
grade level
Horner & Todd, 2010
Problem Statement
We have high rates of physical aggression, disrespect and
inappropriate language on the playground during second and
third grade recess. Many students are involved, and it appears
they are trying to get access to equipment/games.
Benefits resulting from defining the problem
with more precision:
Solution implementation decisions are more specific, function-based and
have contextual fit
(focused on equipment/games)
Provides opportunities for better instruction
Prevents further planning and loss of instructional time
Horner & Todd, 2010
It’s NOT about admiring the
problem, it’s about fully
understanding the problem so that
the solution can be well-designed.
Sample Precise Problem/Gap Statement
What is the problem?
(Use your schoolwide student outcome data to make a general statement about the problem/gap. Explain current status in relation to a goal/objective.)
62% of students are meeting DIBELS benchmark goals at the end of the school year, compared to our goal of at least 80% of students.
Who is experiencing the problem?
(What grade levels and subgroups are affected?)
90% of K students are at benchmark. Fewer students meet benchmark goals after K.
Where is it happening?
(Is the problem/gap consistent in all classrooms and schools in the district?)
Not all classrooms are the same. Mrs. Stewart consistently has over 85% of her first grade students at benchmark.
When is the problem most likely?
(Consider the time of year when you see your problems showing up. Are there implications for your instructional scope and sequence?)
There is a drop in the % of first grade students who score at benchmark from Fall to Winter.
Why do we see this problem and what is sustaining it?
(Use your systems/process/fidelity data, such as the PET-R. Start layering in behavior right here.)
PET-R data indicate that we have weaknesses in our core reading curriculum, including fidelity, coverage of big ideas at critical times, and use of explicit instruction. Discipline referrals are above the national median and implementation of SWPBIS is around 60% on the PBIS-SAS and PBIS-TIC, which means that behavior problems are likely affecting academic outcomes.
Sample Precise Problem/Gap Statement (behavior layer)
What is the problem?
(Use your student outcome data to make a general statement about the problem/gap. Explain current status in relation to a goal/objective.)
Average referrals/day/month are .68, above the national median (.22), and have been increasing since December. Problem behaviors are primarily fighting/aggression and disrespect.
Who is experiencing the problem?
(What grade levels, classes, and subgroups are affected? Use the referrals by student report, grade and class custom reports, and the ethnicity report.)
90% of students have 0-1 referrals. Referrals seem to be coming mostly from 5th grade students. Black students are being over-referred.
Where is it happening?
(Use the referrals by location report.)
Majority of referrals are coming from the classroom, which is typical. Next most common location is in the upper L hallway.
When is the problem most likely?
(Use the referrals by time report.)
There is a spike in referrals during 5th grade break in the afternoon.
Why do we see this problem and what is sustaining it?
(Use your systems/process/fidelity data, such as the PBIS-SAS, PBIS-TIC, BoQ and SET.)
PBIS-SAS and PBIS-TIC data show a need to improve teaching of expectations, use of the acknowledgement system, and using data for decision making.
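The referrals/day/month rate above is simple arithmetic; a minimal sketch with hypothetical monthly counts (only the .22 national median comes from the slide):

```python
# Office discipline referrals for one month (hypothetical counts).
referrals_in_month = 15
school_days_in_month = 22

# Referrals per school day for that month.
rate = referrals_in_month / school_days_in_month
national_median = 0.22  # elementary national median cited on the slide

print(f"Referrals/day/month: {rate:.2f}")  # 0.68
print("Above national median" if rate > national_median else "At or below national median")
```

Comparing each month's rate to the median, and watching the trend across months (e.g., increasing since December), is what makes the problem statement precise.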
Pre-Correction
• Cause for gap statements should only address
things that are within the control of school staff
to change.
• Things that should not be listed as a cause for
gap:
– Student skill deficits
– Students’ race/ethnicity, disability status,
socio-economic status, home support
Discussion
– Given the following precise problem/gap
statement, draft a cause for gap
statement. Explain why we see fewer
students at benchmark than we want.
Breaking Down Your Plan
[Diagram: Goal → Gap Statement and Cause for Gap → Objectives → Strategies → Activities. The School Data Profile and fidelity data fit at the gap and cause-for-gap level; student outcome data fits at the goal and objective level.]
Establishing Priorities
• After studying your data, you should have
a strong understanding of what’s
happening.
• You will not need to address every single
cause for gap.
• The trick will be picking the key pieces to
focus on that will create the most
meaningful and substantial change.
Process for Establishing
SMART Reading Objectives
• The leadership team and teachers look at their data.
• Staff identify the number of students who they believe
can reach the benchmark goal by the end of the school
year.
• Staff identify the students they feel, with appropriate
intervention, will be able to move to needing only
universal or strategic supports.
• The total number of students identified by the
school/grade level is converted into a percentage.
• Every year this is done again BUT you cannot aim for a
LOWER percentage.
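The percentage step in the process above is a straightforward roll-up; a minimal sketch with hypothetical grade-level counts:

```python
# Students staff believe can reach benchmark by year end,
# identified per grade level (hypothetical counts).
identified = {"K": 52, "1st": 48, "2nd": 45}
enrolled = {"K": 60, "1st": 62, "2nd": 58}

# Roll the identified counts up into a schoolwide percentage.
objective_pct = round(100 * sum(identified.values()) / sum(enrolled.values()), 1)
print(f"Objective: {objective_pct}% of students at benchmark")  # 80.6%

# Repeated each year, but the target may never be set lower than before.
last_year_pct = 78.0  # hypothetical prior target
objective_pct = max(objective_pct, last_year_pct)
```

The same roll-up can be done per grade level to set grade-level objectives alongside the schoolwide one.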
Creating SMART Objectives
SPECIFIC
clearly defined, beyond global statements
MEASURABLE
tied to data which allows for objective evaluation
ATTAINABLE
willing and able to do but still challenging
RESULTS-BASED
tied to important outcomes
TIME-BOUND
set a specific time frame
Pick the SMART Objective
• The percentage of third grade students
scoring in the proficient categories on the
Reading MEAP will increase from 44% to
60% by fall of 2012.
• We will increase the percentage of
students who pass the MEAP next year.
Pick the SMART Objective
• The percent of third grade students
scoring proficient in the informational text
strand of the MEAP will increase from 56%
to 100% by 2011.
• The percent of third grade students
scoring proficient in the informational text
strand of the MEAP will increase from 56%
to 75% by 2011.
Pick the SMART Objective
• The percent of red students in second
grade will decrease from 34% to 10% by
May.
• The percent of second grade students
scoring at the intensive instructional level
in oral reading fluency will decrease from
34% to 10% by May, 2012.
Discussion
Take your problem/gap statement
and cause for gap statement and
use it to develop SMART
objectives.
Breaking Down Your Plan
[Diagram: Goal → Gap Statement and Cause for Gap → Objectives → Strategies → Activities. The School Data Profile and fidelity data fit at the gap and cause-for-gap level; student outcome data fits at the goal and objective level.]
Identifying Strategies and Activities
• After you’ve developed your precise
problem/gap statement and cause for
gap statement, you should be fully
prepared to identify strategies and
activities.
• Address all causes for the gap within
your strategies and activities.
Points of Clarification
• Behavior cannot be listed as primary goal in
the School Improvement Plan. Instead, it
should be listed as a STRATEGY.
• Logic:
– The real goal we are all working toward is
improved student achievement.
– Improving student behavior is therefore not
a goal in and of itself, but a means to get to
improved student achievement.
Layering in Behavior
Problem/gap statement
based on academic student
outcome data
Cause for gap based on
implementation holes within
the schoolwide reading
model
Cause for gap based on
student behavior and
implementation holes within
the schoolwide PBIS model
Strategies and activities that
support improved student
outcomes through improved
implementation of the
schoolwide reading model
Strategies and activities that
support improved student
outcomes through improved
implementation of the
schoolwide PBIS model
Discussion
– Begin developing ideas for
strategies and activities.
– Can you draw a line through your
whole plan from activity to strategy
to objective, to cause for gap
statement, to gap statement, to
goal? (and reverse it too?)
Concluding Big Ideas
• We need to work smarter. One way to
work smarter is to streamline our work
and integrate our MiBLSi efforts with
school improvement.
• Stay focused on the data and stay
focused on the big ideas.
• Thank you!