>> The broadcast is now starting.
All attendees are in listen only mode.
>> Hi, I want to thank you for joining us for today's webinar entitled Progress Monitoring
for Students with IEPs: An Introduction.
Hopefully you were able to locate the handouts that can be found
on the Pattan website at www.pattan.net.
Go to the link that says training and then go to calendar.
If you find today's date you can click on the name of the webinar,
and you can download the handouts either to your hard drive, or you can print them.
My name is Diane Funsten, and I am an educational consultant
at the Pattan office in King of Prussia.
I'm going to be joined today by a colleague at our Harrisburg office by the name
of Karen Grammas, and Karen and I will be sharing the presentation today.
We really feel there's a need in the field of special education
for information on progress monitoring.
We're hoping to add to the information we provided last year in our webinar series
that focused on IEPs for students with different kinds of needs.
So we did sessions on reading and writing, writing IEPs for students with math deficits,
with behavior deficits, and students for whom assistive technology needs to be considered.
Those sessions were recorded and are archived and can be found on the Pattan website
under the link that's called videos.
Today's session will provide basic information
about progress monitoring including regulatory requirements and is kind of the kickoff
to our series about progress monitoring.
And this series will address similar topics as what we did last year.
So we'll be talking about progress monitoring for students with reading needs
and with writing needs, with math needs, with behavior needs and students
who are receiving assistive technology services.
These sessions will all be recorded and archived as well.
If you want to see the list of our upcoming webinars you can go
to the training calendar again, and this time click on November 26th, which is
when our next session, on progress monitoring for writing, will be held.
You can click on the link that says brochure, and you'll get a list
of the dates of our upcoming webinars.
We really have a lot to cover today, and we're kind of short on time.
This session is really only scheduled to go until 4:30.
So if we have time at the end of the session for questions we will try to take them.
Otherwise please feel free to email Karen or me and we will respond to your questions.
Pattan supports the efforts and initiatives of the Bureau of Special Education,
and we strive to build local capacity of entities that serve students with IEPs.
And, of course, our goal for each child is to ensure that the student's IEP team begins
by considering the general education setting, with the use of supplementary aids and services,
as a possible placement before considering a more restrictive environment.
Here's what we're hoping to accomplish with you today.
We're hoping that you will gain an understanding of the purpose of doing progress monitoring,
why do we do it and what are some of the requirements
around progress monitoring in the IEP process.
We're hoping you'll identify characteristics of different kinds
of progress monitoring procedures.
We're going to show you some examples of graphs and talk to you
about analyzing progress on a graph.
And we're going to offer some resources about progress monitoring throughout the presentation.
Some will be linked to the Pattan website and others will be national resources.
So according to IDEA and state regulations progress monitoring procedures must be
established for each IEP goal.
Progress monitoring is the method of formative assessment used
to measure students' progress toward meeting a goal.
It is called formative because we make changes based on the information that we gather.
The main purposes of progress monitoring are to describe the rate of response to instruction
and to build more effective programs for students.
Progress monitoring procedures guide how data will be collected in order
to make instructional decisions about the progress of the student,
and establish a decision making plan for examining the data that are collected.
Progress monitoring assists the teacher or service provider
in making ongoing instructional decisions about the strategies being used.
It also provides summative evidence that enables the IEP team
to determine whether the student has achieved his or her goals.
We collect progress monitoring data to answer these questions.
Is the student making progress at an acceptable rate?
Are they going to reach their IEP goals?
And is the student, in fact, meeting those goals?
Does instructional intervention need to be changed or adjusted based
on the information that we're getting?
So progress monitoring is always about the effectiveness of our intervention.
In other words, is what we are doing working?
You can think of effective progress monitoring as having these characteristics.
Are we measuring the behavior as specified in the goal?
Are we using equivalent measures each time?
We want to be able to compare similar measures.
Are we providing regular and frequent data collection?
We need to know is our instruction working so we need to look at data frequently.
Are the tools easy to implement?
Because if they're not we will end up not doing them.
And they can't take a lot of time away from our instruction.
It's very important that effective progress monitoring allows
for analysis of performance over time.
We're actually looking at performance across the life of that IEP goal.
Again, under federal regulations the IEP must contain a description
of those things that you see on the screen.
We have to look at how the child's progress toward meeting the goals will be measured
and when periodic reports on the progress the child is making will be provided.
And this is done through the use of quarterly or other periodic reports concurrent
with the issuance of report cards.
Federal regulations also require that an IEP contain a statement
of measurable annual goals including academic and functional goals that are designed
to meet the child's needs to enable him or her to be involved in
and make progress in the general ed curriculum.
So we look to align our IEP goals with the same content that students
without disabilities have access to.
And when the IEP is standards aligned the monitoring of progress is direct and purposeful
and focuses on progress in the general ed curriculum.
And so the purpose of progress monitoring in a standards aligned system is
to determine progress in the general ed curriculum.
This means we are using mastery of subject content as defined
by the PA Core Standards and PA academic standards.
This may be a different way of looking at progress monitoring
from the way folks have been doing progress monitoring for IEPs.
Remember that many of our students with IEPs still have deficits in areas
that are not aligned to state standards.
We're going to be talking about different kinds of measures for monitoring student progress.
One item or type of measure may not always be able to measure an entire goal.
The IEP should be looking at summative, formative, diagnostic and benchmark data
as multiple sources of progress and a way to report information to parents.
More information about those four types of assessments can be found
on the SAIEPs portal on the PDE website.
Also, we won't have time to review the development of a standards aligned IEP,
but a webinar was provided on the topic about a year ago.
It was recorded and archived on the Pattan website, and we encourage you
to go back and take a look at it.
The link to that presentation is on the slide.
Mastery measures and general outcome measures are two common progress monitoring approaches.
One key difference between mastery measures
and general outcome measures is the ability to look at data across time.
With general outcome measures you can compare the score a student received in May
to the score he or she had in September.
This cannot be done with mastery measures because each subskill is tracked separately.
And we're going to begin by talking about mastery measures in more detail.
Mastery measures, which we might also refer to
as specific skills measures, are not empirical.
That is, they're not provable or verifiable by experience or experiment.
They often are derived by task analysis.
Mastery measures measure a series of what you might think of as short-term objectives.
So you might think of a sequence of addition skills starting with single digit addition
with no regrouping, to single digit addition with regrouping,
and then moving to multi-digit addition without regrouping,
and to multi-digit addition with regrouping.
There are technical problems with using mastery measures to quantify progress across objectives.
First of all, they don't look at maintenance of skills over time.
We're really talking about looking at, for example, single digit addition
without regrouping as a skill that has to be mastered before we move to the next skill.
Because teachers often develop mastery measures, they really don't have the reliability
and validity that we would see in commercially provided assessments.
And the objectives are not equivalent units.
So testing a student on single digit addition without regrouping is very different
from testing multi-digit addition without regrouping.
It's important to note that mastery measures or specific skills measures can be used
to measure both academic as well as functional skills, and they play an important part
in progress monitoring for our students with IEPs.
On the slide you see an example of using mastery measurement to assess addition.
This assessment includes the addition of two and three digit numbers with and without regrouping.
You can see that basically all of the problems are the same type.
Mastery measures assess one skill at a time.
Here's an example of how data from the mastery measurement assessment
on the previous slide is graphed.
You're looking at a graph of the number of multi-digit addition problems correct,
with biweekly probes across the X axis.
Notice that the aim line is drawn across at the expected performance on the IEP goal,
in this case 16 out of 20 correct.
We will go into more detail on using decision rules to determine
when a student has reached his or her goal.
The other common approach to measuring student progress is using general outcome measures,
which are a specific set of testing strategies for academic skills assessment.
General outcome measures are standardized norm referenced assessments that are reliable
and valid indicators of student achievement.
In many ways they're very different from mastery measures.
General outcome measures reflect overall competence at a specific grade level curriculum,
for example, third grade reading or sixth grade math.
They're made up of tasks of about equal difficulty that are given throughout the year
so that growth toward a final goal can be measured.
General outcome measures can be used as both screening and progress monitoring measures.
Many curriculum based measures or CBMs are types of general outcome measures.
And you're probably fairly familiar with CBM.
General outcome measures are sensitive to the improvement of student achievement over time.
And they provide curriculum linked assessment information
that helps teachers better plan instruction.
This is a sample goal utilizing a general outcome approach.
[ Pause ]
Each progress monitoring probe Jose will be given will contain the same number of problems,
and all will be taken from the third grade math curriculum.
Jose's improving performance on the probes will indicate
that he is generally getting better at third grade math.
It won't tell us specifically that he's getting better at two digit multiplication
or that he's getting better on fractions.
That's really where mastery measurement comes in.
But it does tell us that the math instruction his teacher is providing is working,
and hopefully it's closing his achievement gap.
Again, this is different from mastery measurement, which might look at success
on math facts, then on single digit multiplication, and then
on multi-digit multiplication, and so forth.
But both types of measures are important.
Here is an example of a general outcome assessment that Jose might be given.
A general outcome measure reflects all skills in a year long curriculum
with random placement of problem types.
So you'll see by looking at this slide that there are lots of different kinds of math skills
that are being assessed on the same probe.
By assessing all of the objectives
in that curriculum general outcome measures will be sensitive to growth
as more skills are taught regardless of the order in which they are taught.
General outcome measures allow teachers to determine
if students are retaining the skills they've been taught and generalizing to skills
that they have not yet been taught.
And here's an example of how Jose's performance on those assessments is graphed.
You're looking at weekly assessment of the number of digits correct per minute,
and the goal line is drawn from the student's baseline at five digits correct per minute
to the benchmark established by the developer of the assessment tool
which is 30 digits correct per minute.
And students' scores will be plotted and analyzed against that goal line.
Again, we will go into more detail on the analysis of data.
And I can see from looking at my screen
that the dotted line should go down to the five but it doesn't.
So you can sort of mark that in yourself.
Notice that the goal line is drawn this way on a general outcome measure.
On a mastery measurement measure it's drawn across the criterion line.
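The goal-line arithmetic just described can be sketched in a few lines of code. This is an illustrative sketch only: the baseline of 5 and benchmark of 30 digits correct per minute come from Jose's example above, while the 36-week school year and the function name are assumptions.

```python
# Illustrative sketch of a general outcome measure goal line:
# a straight line from the baseline (5 digits correct per minute)
# to the end-of-year benchmark (30 digits correct per minute).
# The 36-week school year is an assumed value.

def goal_line_value(week, baseline=5.0, benchmark=30.0, total_weeks=36):
    """Expected score at a given week along the goal line."""
    slope = (benchmark - baseline) / total_weeks  # growth per week
    return baseline + slope * week

print(round(goal_line_value(0), 2))   # 5.0  (start of year)
print(round(goal_line_value(18), 2))  # 17.5 (midyear)
print(round(goal_line_value(36), 2))  # 30.0 (benchmark)
```

A student whose weekly scores fall below this line for several consecutive probes would be signaling that instruction may need to be adjusted, which is the kind of decision rule taken up later in the session.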
We're going to be following a seven step progress monitoring process.
And the general goal of this approach is
to measure progress toward the annual goals and objectives on an IEP.
So we're going to be starting with measurable annual goals and objectives,
looking at data collection decisions and data collection tools, look at graphing those data,
evaluating those data, determining when we have to adjust instruction,
and then communicating that information to parents.
The first step is writing measurable annual goals.
And when required for students who take the alternate assessment you have
to write short-term objectives.
But certainly teachers can always choose
to write short-term objectives for any of their students.
Let's take a look at step 1.
When we write annual goals and short-term objectives it's important
to include these four criteria.
The first is the condition, which describes the situation in which the student
will perform the behavior, for example, "given mixed calculation problems."
So it's sort of like setting up the assessment situation.
Also, we need to include, of course, the student's name,
the student's clearly defined behavior which is part of the goal.
And here we're looking for behavior that is measurable and observable.
So in this example we're talking about Jose will calculate.
And then we have the performance criteria that actually have a couple of parts to them.
One would be the criterion level which is the performance level the student must demonstrate.
In this example it's 30 digits correct per minute.
We need to talk about the number
of times the behavior must be performed to reach mastery.
So here we're talking about three out of four consecutive trials.
If Jose reached 30 digits correct per minute once that's not enough
to say that he has reached the goal.
He really needs to show that performance on three out of four consecutive trials.
And then we have to put in the evaluation schedule
which would be how frequently we're going to assess that skill.
In this case we might be looking at consecutive daily trials or consecutive weekly trials.
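The decision rule just described, reaching the criterion on three out of four consecutive trials, can be sketched as a small check. The function name, parameters, and score lists are made up for illustration; only the 30 digits correct per minute criterion and the three-of-four rule come from Jose's example above.

```python
# Illustrative check of the mastery criterion described above:
# did the student score at or above the criterion on three out of
# four consecutive trials? Names and scores are hypothetical.

def met_criterion(scores, criterion=30, needed=3, window=4):
    """Return True if any `window` consecutive scores contain
    at least `needed` scores meeting the criterion."""
    for i in range(len(scores) - window + 1):
        hits = sum(s >= criterion for s in scores[i:i + window])
        if hits >= needed:
            return True
    return False

# Jose hit 30 once early on, which alone is not enough; only the
# later run of three hits within four consecutive probes counts.
print(met_criterion([30, 22, 25, 24, 31, 30, 28, 33]))  # True
print(met_criterion([30, 22, 25, 24, 26, 27, 28, 29]))  # False
```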
And there's a little picture of a publication
that Pattan has put together called writing effective IEP goals that's available
as a publication on the Pattan website.
And that will assist you and give you a little bit more information on those four criteria.
Here are the criteria if we were doing a mastery measurement or a specific skills goal.
So the condition is during morning circle.
The student's name is Joann.
She will sit with no support, and she needs to do this for 15 minutes but not just once,
she needs to sit for 15 minutes for five consecutive days.
Those are those four criteria that are really important and need to be included in every one
of your measurable annual goals and short-term objectives if you're writing them.
Goals are set differently with general outcome measurement
and for mastery or specific skills measurement.
Here we're going to look at using curriculum based measures to set goals.
With curriculum based measurement, assessment tool developers establish a set of benchmarks
that students are expected to achieve across the school year.
They also suggest a rate of improvement also thought of as slope
at which a student should make progress.
The benchmarks help teachers decide when instruction is resulting
in students meeting the goals and when goals should be raised.
Notice that we never lower goals.
Instead we intensify and/or change instruction.
For mastery or specific skills measurement there are no benchmarks or suggested goals.
Instead, teachers collect baseline data on student skills
and determine how much progress the student will make over the course of the year.
Student performance on the selected goal is assessed and graphed.
You see the suggestion on the slide that if the student is going to be receiving ESY services
that you set up the X axis of your graph across all 52 weeks.
There are three different ways to set end of the year goals using curriculum based measures.
One way is to use end of year benchmarks and, again,
we're focusing on general outcome measures.
Here's an example of benchmarks for math computation and for math concepts
and applications across grades 1 through 6 as established
by the developers of a math CBM tool.
So you can see that at the end of second grade students should be scoring
about 45 correct digits per minute on computation probes
and should be filling in about 20 correct blanks on concepts and applications probes.
The second option for making an end of year performance goal is to use national norms.
This chart shows curriculum based measurement norms for student growth or slope by grade.
For computation CBM we're looking at slope for digits correct per minute.
For concepts and applications CBM we're looking at slope for correct blanks.
Teachers can use a resource like this research based norms table
to identify the average expected rate of weekly increase for students at each grade level.
For example, this table indicates that a first grade student would be expected
to compute .35 additional digits correct each week on computation probes.
And a fourth grade student would be expected
to compute an additional .7 digits correct per minute on computation probes
and .7 additional blanks correct on concepts and applications probes.
And we will be showing you a resource to help you compute end of the year goals.
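The goal computation described here, baseline plus the expected weekly rate of improvement times the number of weeks, can be sketched as follows. The 0.7 digits per week rate for a fourth grader comes from the table just discussed; the baseline of 12 and the 36-week year are made-up illustration numbers, and the function name is an assumption.

```python
# Hypothetical sketch: computing an end-of-year CBM goal from a
# research-based weekly rate of improvement (ROI). The fourth grade
# ROI of 0.7 digits per week is from the norms table above; the
# baseline score and number of weeks are invented for illustration.

def end_of_year_goal(baseline, weekly_roi, weeks):
    """Project a goal: baseline score plus expected weekly growth."""
    return baseline + weekly_roi * weeks

# A fourth grader with a baseline of 12 digits correct per minute,
# growing 0.7 digits per week over a 36-week school year:
goal = end_of_year_goal(baseline=12, weekly_roi=0.7, weeks=36)
print(round(goal, 1))  # 37.2
```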
[ Pause ]
Here you see DIBELS Sixth Edition benchmarks for each subtest that is given in each grade.
I'll just move this down a little bit.
And you can see the benchmarks are established for fall, winter and spring.
These benchmarks can be used to compute end of the year goals for reading.
So you can combine these benchmarks with the skill identified in the grade 2 Core Standards
for English language arts to write a goal.
And the goal might say something like the student will know and apply grade 2 phonics
and word analysis skills to decode words as measured by a score of 90
or more words correct per minute on four out of six weekly grade 2 oral reading fluency probes.
So the goal is not so much a fluency goal as it is a phonics goal and a word attack goal
or word analysis goal that uses a curriculum based measure as its measurement.
Just get out of this.
[ Pause ]
Okay, oops, we get back to this again.
And get down to where I was before.
[ Pause ]
Okay, let's go to this.
Sorry about this.
Okay, here are rates of improvement for oral reading fluency.
These can be used to calculate end of the year goals for reading as well.
The third option for setting end of year goals is called the intra-individual framework.
And it is often used for setting IEP goals for those students performing far below grade level.
Using this option the student is compared to his or her own performance,
not to a national or a local norm.
This calculation uses student rate of improvement, the number of weeks
in the instructional period, and the student baseline score.
But I have to tell you that the calculation is a little bit involved.
And given our time constraints we really can't walk you through this calculation.
But on the slide we offer some very helpful resources
for calculating goals using this technique.
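As a rough illustration of the intra-individual framework, here is one commonly published version of the calculation: take the student's own rate of improvement from baseline data, multiply it by 1.5 to set an ambitious rate, and project forward from the median baseline score. The 1.5 multiplier and the use of the median are assumptions drawn from the CBM literature, not from this presentation, and all the numbers are invented.

```python
# Hedged sketch of the intra-individual framework for goal setting.
# The 1.5 multiplier and the median starting point are the version
# commonly published in the CBM literature; they are assumptions
# here, not stated in this webinar. Scores and weeks are invented.

def intra_individual_goal(baseline_scores, weeks_of_data, weeks_remaining):
    """Set a goal from the student's own rate of improvement (ROI)."""
    # Current ROI: growth across the baseline period, per week.
    current_roi = (baseline_scores[-1] - baseline_scores[0]) / weeks_of_data
    # Ambitious ROI: the student's current rate multiplied by 1.5.
    ambitious_roi = current_roi * 1.5
    # Median of the baseline scores serves as the starting point.
    start = sorted(baseline_scores)[len(baseline_scores) // 2]
    return start + ambitious_roi * weeks_remaining

# Baseline scores of 4, 5, and 7 collected over 8 weeks, with
# 20 weeks left in the instructional period:
goal = intra_individual_goal([4, 5, 7], weeks_of_data=8, weeks_remaining=20)
print(round(goal, 2))  # 16.25
```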
Here is an example of establishing a goal using mastery measurement also known
as specific skills measurement.
So the goal has to do with a student being able to identify relevant details in order
to answer comprehension questions and/or sequence events with 80 percent
on five consecutive weekly probes.
There are actually two skills that are being looked at.
And so the teacher chooses to write short-term objectives.
The first one talking about providing an array of choices that includes distractors,
and that the student will identify relevant details in response to teacher prompts
with 80 percent accuracy across five consecutive weekly probes.
Another short-term objective is that the student will answer comprehension questions
with 80 percent accuracy on five consecutive weekly probes
when she is given relevant details.
And then you would probably have a third short-term objective that would focus
on using relevant details to sequence events.
So at this point I'm going to turn the presentation over to Karen
who is going to move us to step 2.
>> Karen: Thank you, Diane.
So we're going to look at step 2 in our progress monitoring process.
And step 2 looks at making data collection decisions.
[ Pause ]
So, Diane, maybe you will have to advance the slides for me.
[ Pause ]
Oh, there we go.
Okay, the purpose for collecting this data actually is twofold.
First of all it serves as a day-to-day guide for making adjustments in the instruction.
And then it also serves to provide the information that's needed on student progress
when we do the annual review of the IEP or when we are going through the re-evaluation process.
So you want to look ahead at the type of questions that you want answered
when you're going through that IEP annual review or the re-eval process.
Selecting the tool used for progress monitoring should come after you've made some decisions.
So looking at this slide one of the things we have
to ask ourselves is what is the purpose for collecting the data.
You want to assess the criteria of performance that's meaningful for the skill
or the behavior that you are observing.
You might decide to measure more than one skill or behavior.
And for an example maybe you're looking at the quality of sitting.
Do you remember our example from before, where Joann will sit without support?
So we want to look at maybe the quality of sitting as well as the duration
that she's going to sit in that morning meeting.
They both may be important to measure so you want to have that consideration.
You also want to look at what type of data will be collected.
Here's a list of some examples.
You know, fluency, quality, percentage of accuracy, frequency or rate.
All of those things are possibilities when you're looking
at what type of data will be collected.
But you need to consider, again, what's the purpose, what skill you're looking at,
and what is the best way to collect that.
Another consideration involves where the data will be collected.
So think about is it best to collect that data in the classroom,
or is this data that you're going to collect on the playground or in the cafeteria.
Perhaps for an older student is this data you're going to collect on a job site maybe?
So those considerations come into play as well.
Another consideration is how often will it be collected.
Are you going to do it daily, weekly, quarterly?
You saw some different examples in some of the goals and graphs that Diane presented.
And then you also want to think about who is going to collect that data.
This could be one of several people.
Obviously it could be a teacher who works with the student.
But it could be a job coach.
In some instances maybe it would be best for the paraprofessional to collect that data.
Or it might be somebody else.
You may even involve parents in some of that data collection.
So during the IEP meeting you can ask for parent input on what type
of data will be collected, how often.
But get the parents involved.
They should be part of this conversation.
In addition you might want to discuss with the parents some other options
for data collection outside of the school setting or their job setting.
So at home, are these skills being generalized into the home setting?
Is the student transferring those skills into other settings?
And you might want to help the parents set up a data collection system outside
of the school setting so that you can get additional information.
[ Pause ]
As we continue to move through this seven step process step 3 addresses then the data
collection tools that you might consider.
And we're going to look at some characteristics for some progress monitoring tools.
Progress monitoring allows teachers to assess students' academic performance frequently,
using brief measures, in order to make our best instructional decisions.
So we talked about some of the data decisions that need to be made such as what type
of data will need to be collected, where, how often, those kinds of things.
Some additional questions to help guide your selection
of a meaningful data collection tool include the following.
Regarding reliability, are the scores accurate and consistent?
In terms of validity does the assessment actually measure the targeted skills
or behavior that you're intending?
How about the sensitivity to change?
Will the measure reveal long-term improvement?
Are you actually going to be able to see improvement as you move through?
Are there alternate forms?
Are there different versions of the assessment of comparable difficulty?
That's really important when you have many times that you need to progress monitor and check
on the progress of the student, that you want to have other forms available of the same tool
so that you can get the most accurate results.
And is there a description of benchmarks for adequate end
of year performance or the goal setting process?
So, again, just some additional questions to help guide your decisions.
[ Pause ]
Here are some tools that you might want to consider in addition
to some that we just talked about.
Because sometimes the tools that we use will be teacher developed, and they won't measure
up to the standards of technical adequacy that we just talked about on the previous slide.
Data collection tools must be selected or designed,
and a schedule to review that data has to be established.
So this slide shows some of the more commonly used data collection tools
and methods that we might come across.
Remember all the options when we think about using the four types
of assessment to monitor student progress.
And Diane had mentioned the summative, formative, diagnostic and benchmark tools,
and we'll look at a couple of those in a minute.
Our standards aligned system also asks us to use all of the effective assessment practices listed
on this slide: the structured interviews, rubrics, curriculum based assessments,
anecdotal records, things like that.
As well as pre and post assessments that you might administer.
Unit and theme tests that come from our reading and math series.
You might use writing samples.
You might do comprehension checks.
So all of those are ways to collect more data about a student.
[ Pause ]
Here's some examples of progress monitoring tools for each
of the different types of assessments.
So under summative assessment you can look at PSSA scores.
You might look at some district achievement tests that were developed.
For formative assessment, a lot of the things that we just had on that previous slide:
exit tickets, work samples, checklists, rubrics, things such as that.
Here are some examples of diagnostic tools: the Gray Oral Reading Test,
the Test of Written Language, the KeyMath-3.
And a benchmark might be Study Island assessments in reading or math,
words correct per minute, correct word sequences in writing, digits correct in math.
These are just some possibilities of things that could be reported on a student's progress.
The key in all of this is we want to make sure that these measures are going to inform the IEP
of the student's progress and the future direction that we need to go.
Be careful in how you display and analyze the data so that it's understood
in terms of making those decisions:
how instruction is provided, and then how you adjust it and modify it.
[ Pause ]
This slide provides links to online charts depicting a selection of tools
for both general outcome measures and mastery measures.
So there's two places you can go.
We can go to rti4success or intensiveintervention as just two examples.
And let's see if we can go there.
Here, under response to intervention, you can see
that there's a nice chart of progress monitoring tools.
So many of us are familiar with AIMSweb.
Let me get the headings back up here for you.
[ Pause ]
Okay, so many of us are familiar with AIMSweb.
And it will show you, for each tool, the reliability, the validity,
whether alternate forms are available, whether it is sensitive to student improvement,
whether there are end of year benchmarks, all of those kinds of things.
That's a really nice place to go to just get a huge list
of tools that you might want to consider.
And then if you look up here you can go in and look at the mastery measures charts by clicking
on that link or by entering one of the links from this slide.
Okay? And, again, you can go to intensiveintervention
and get a similar chart of different tools.
[ Pause ]
Okay, so let's look at a couple of examples of a teacher who has a question
about choosing a progress monitoring tool.
In this example the teacher says: I'm a teacher with 25 students in my classroom.
I can't afford to set aside blocks of time
to administer progress monitoring probes to selected students.
Are there tools that can be administered by the paraprofessional in my classroom?
Well, the answer is yeah.
Understandably with how busy our lives
as teachers are this is a common question that comes up often.
And having a paraprofessional administer the progress monitoring tools certainly could be an option.
You want to make sure that the guidelines set by your LEA are followed.
And we want to make sure that the paraprofessional
who is administering the tool is trained in the administration of that.
So check with your district, check with your building to make sure
that that is a viable option for you.
But definitely we can use somebody like a paraprofessional
to administer some of those tools.
Here's another example.
I am interested in finding a tool that I can use
to monitor my students' progress weekly or even more frequently.
Are there tools that have at least 20 alternative forms?
Okay, so here we talked about this earlier.
You need a selection of forms from the same tool that you're familiar with using.
So here are some examples again of some tools that might have alternate forms such as AIMSweb,
the DIBELS has alternate forms, Monitoring Basic Skills, STAR, Accelerated Math.
All of those have at least 20 alternate forms.
And if you remember, from the link that I took you to, either rti4success
or intensiveintervention one of the columns in those charts did indicate whether
or not there were alternate forms available.
So definitely something you want to look into if you have several times such as weekly
or even more frequently that you need to check a student's progress.
[ Pause ]
As we continue to move through our seven-step process, step 4 refers
to actually representing the data.
So we've gone through all of the decisions that we have to make, who and where and how often
and what type of data, all of those things.
And we've actually gone through the process of collecting all of this data.
Now, how do we best represent it so that people can make the most sense of it?
Because we collect all of these numbers, we need to make sure that
our analysis isn't so cumbersome
that we can't get through it as a team.
And we want it to be represented in such a way that people can make sense out of it;
otherwise, they're not going to pay close attention to those things that are most important.
And in your representation you're also going to get a feel
for whether we are actually measuring what we intended.
Are we actually collecting the right data on the right skills that were set
out in the IEP goals and objectives?
So one way to do that, and one thing that Pattan has created, is a guide for how
to create a graph for progress monitoring.
So, again, we won't take the time through this webinar to go through and show you how
to create those graphs and things like Diane was showing earlier.
But we will show you that there is a link under the publications tab on the Pattan website
with a document for how to create a graph for progress monitoring.
Well, [inaudible] documents [inaudible].
So here is that document a little bit larger for you.
So just step 1: set up the graph.
Step 2: establish a baseline, set up the goal.
So that's a really nice document to give you a little bit of guidelines.
[ Pause ]
Okay, so we're talking about step 5.
I apologize everybody.
So we're looking at step 5, and we were talking about the importance
of the analysis and the evaluation of the data.
So here are just some things, again, some guiding questions for what to look
for when we're evaluating the data.
So, first of all, is the student making progress toward the goals and objectives?
In answering this question consider the time line that was specified in the IEP
and the baseline or the student's starting point.
A basic rule of thumb, and you can see it here on the slide, in making those decisions
about whether or not a student is making progress is to look at the last six data points
on the graph of student progress.
If four of the last six points are below the aim line then it should be concluded
that the student is not making progress.
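The four-of-six rule of thumb just described can be sketched in a few lines of Python. The function name and the sample numbers are invented for illustration:

```python
def making_progress(scores, aim_line_values, window=6, threshold=4):
    """Apply the rule of thumb: if `threshold` of the last `window` data
    points fall below the aim line, conclude progress is not being made."""
    recent = list(zip(scores, aim_line_values))[-window:]
    below = sum(1 for score, aim in recent if score < aim)
    return below < threshold

# Hypothetical data: student scores vs. the aim-line value on each probe day.
scores = [10, 12, 11, 13, 12, 14]
aim    = [10, 12, 14, 16, 18, 20]
print(making_progress(scores, aim))  # 4 of the last 6 are below the aim line -> False
```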
And then another question is how is the student responding to the intervention?
If the specially designed instruction and materials
and methods are appropriate then the student should be making progress.
If the student is not making progress then we should look at those areas for making changes,
looking at how the instruction is being delivered.
What materials are being used?
How is the specially designed instruction being implemented and provided for this student?
These questions that I'm presenting to you are really better answered by a team
which includes the parents, of course, so that progress can be evaluated
across multiple settings, again, looking at that transfer skills, that generalization skill.
So some example decision rules.
Let's say in this first example, if the student's performance is below the aim line
on three consecutive days but is parallel to the aim line, you might decide to wait to see
if the student's performance accelerates to a level that reaches the original aim line.
In this example we looked at data after three consecutive points.
And we looked at the trend and kind of asked is the aim line parallel to the line of progress.
And then we give an opportunity to continue our instructional process
as long as the lines are parallel.
We do this here because sometimes new instructional strategies take time
to show those results, and changing instructional approaches too quickly might actually be
detrimental to the student.
So we want to give it time.
At the same time if the student's performance is dramatically off
of our planned aim line this decision rule offers us the chance to make changes quickly
in our instructional plan to make sure that we know
that we're providing the student what they need.
If we continue another three data points with the same results then we do need
to consider making that instructional change.
You want to draw a vertical line on the graph so that we see when there was a change
in the program, and then we can compare the data.
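This example decision rule might be sketched as follows. The "parallel" tolerance, the function name, and the sample numbers are all assumptions of mine for illustration, not part of the webinar:

```python
def decision(last_three_scores, aim_values, parallel_tolerance=2.0):
    """Example rule: if the last three points are below the aim line but
    roughly parallel to it, wait; if dramatically off, consider a change.

    `parallel_tolerance` is an invented threshold: how much the student's
    slope may differ from the aim line's slope and still count as parallel."""
    all_below = all(s < a for s, a in zip(last_three_scores, aim_values))
    if not all_below:
        return "continue"
    student_slope = (last_three_scores[-1] - last_three_scores[0]) / 2
    aim_slope = (aim_values[-1] - aim_values[0]) / 2
    if abs(student_slope - aim_slope) <= parallel_tolerance:
        return "wait"          # parallel: give the instruction time to work
    return "consider change"   # dramatically off the aim line: change quickly

print(decision([10, 12, 14], [15, 17, 19]))  # below but parallel -> "wait"
```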
Again, another example of a decision rule if the student's performance is above the aim line
after three days it may be appropriate to raise the aim line.
Draw a line parallel to and above the aim line.
So the nature of the instructional change is not determined directly
from this type of monitoring.
The decision about what to change actually comes from the clinical skills of the teacher,
as well as the examination of the data being collected,
and then a collective sort of wisdom about what may or may not be the cause of no progress.
The specific changes can be small.
Maybe they're just related to motivation or the way that prompts are being delivered.
Or the changes could be more dramatic
such as actually making changes in the curricular content.
But note that the changes are not always about the student's academic skills.
They can equally, and maybe even more often, be about changing the instructional environment
in ways that maximize the student performance.
We tend to maybe move too quickly to skills as the deficiency and not to look constructively
to see if the teachers are maximizing the opportunities for the students to succeed.
So we want to make sure we look at our practices, how are we implementing,
how are we delivering instruction, is specially designed instruction in place,
those kinds of things before we jump to skill deficits as the cause.
So here, if we look at students who are not making adequate progress, we're making data-based decisions.
So the data has been collected, and if we see increasing scores, right,
so here's our goal line, here's where we wanted the student to be.
And if we see increasing scores for the student's trend line and their data points fall
above that then the student is becoming a better, I don't know,
mathematician or a better reader.
If we see sort of flat scores, so here's the goal line and here's the student data points
and they just kind of have a flat trend, then the student really is not profiting
from the instruction and requires a change in the instructional program.
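One simple way to tell increasing scores from flat ones is to fit a least-squares trend line to the data points. This sketch is illustrative only; the 0.5 flatness cutoff and the sample data are invented for the example, not a published rule:

```python
def trend_slope(scores):
    """Least-squares slope of scores plotted against weeks 1, 2, 3, ..."""
    n = len(scores)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical weekly data for two students.
rising = [20, 22, 25, 27, 30, 33]
flat   = [21, 20, 22, 21, 20, 22]
for name, data in [("rising", rising), ("flat", flat)]:
    slope = trend_slope(data)
    label = "making progress" if slope > 0.5 else "flat - consider a change"
    print(name, round(slope, 2), label)
```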
[ Pause ]
So looking at this graph of data is the student making progress in [inaudible] read correctly.
It looks like they are.
The trend is going up, right?
And it looks like Sarah is making progress in that skill.
Looking at this example is this student making progress?
Trend is kind of flat.
If we drew a line in here data points are all sort of even, falling right there in the 20
to 25, 30 range, not really making a whole lot of progress.
Looking at this data, I would have to ask myself: if I saw this, why would I wait
to make a change in the instruction?
This is a whole year's worth of sort of flat results, right?
So really we should have been looking back quite a ways, a couple of months in,
asking why this student isn't making progress.
And should we be making an instructional change?
We don't want to see flat data, flat progress across the year.
[ Pause ]
So, again, looking at Laura here's our goal line, right?
Here's our trend line.
Is she making progress?
So a teaching change is warranted here.
She's just kind of making flat, very minimal progress.
And her progress is less than the goal line,
so we would definitely want to make a teaching change there.
And here's a vertical line showing the difference from the baseline to where some sort
of instructional change was made.
And there may be another vertical line here showing another change
in instruction if that's the decision.
Here's his goal, but here's his actual trend, right?
Here are his data points.
They're falling above.
It might be appropriate here to raise the goal.
He is performing well above the goal that was set
so it might be warranted to change that goal.
We can also look at mastery.
And we can look at several things at one time.
So here's adding multi-digit numbers.
So, again, here's our goal, here's where we want kids to be.
And then here is the progress in the addition.
And once that was mastered here is the progress made in the subtraction.
And then maybe it's appropriate to move onto multiplication.
So, again, you can see the change
in instructional program there with the vertical lines, okay?
So the goal might be: by year's end, the student will increase performance by one grade level.
By October, the student will master multi-digit addition with regrouping.
Is that what happened?
It looks like it did.
By December 1st the student will master multi-digit subtraction.
Did that happen?
And then by January the student will master multiplication facts.
[ Pause ]
This is a report from Yearly Progress Pro.
This indicates student performance on the skill of recognizing addition
and subtraction as inverse operations.
And, again, this is just a report that's generated from this site.
And you can see on this example the student is making progress.
[ Pause ]
Okay. Here's an example of a functional skill, or a specific skill, as compared to a skill
that can be measured with a general outcome measure.
So here is this annual goal: while seated in a wheelchair
and given a direct verbal cue, the student will reach for and actively grasp four
out of five objects within 20 seconds
of the visual presentation slightly below eye level, on three consecutive trials.
So here is the baseline data that was collected.
The instructional program was put into place.
Here's our aim line.
And, again, you want that to be parallel the whole way across.
We want 80 percent for three consecutive trials.
So there's your 80 percent three consecutive trials.
And we're looking right now, she's kind of, you know --
here's a burst at 60 percent when it was first introduced, and then it kind of dropped back
to match the baseline at 40 percent.
And she's not showing consistent progress here.
So we're still working on that reach-and-grasp skill.
So, again, this calls for some consideration.
And look at your time line down here.
We're looking at a month from April through May.
And we want to look at considerations for when should there be discussion
about making an instructional change, those kinds of things.
Once the data has been collected and evaluated then the need
for instructional adjustments must be considered.
So, again, thinking about is the student making progress and what does that progress look like.
If they're making steep progress and they're way
above the goal instructional adjustments might be warranted.
If they're not making progress and they're way below the goal, again,
we might need to make instructional adjustments.
Remember that the adjustment might be in the goal.
Or it might not be made in what we're setting forth for the student,
but in the way that we're delivering that instruction
and providing the support for the student.
So let's look in terms of instructional adjustments.
If the student is making progress then definitely celebrate that.
Keep doing what you're doing.
Focus on the progress that's being made.
And then remember to consider increasing those expectations if that's appropriate.
And don't forget to involve the parents.
Parents want to know when their kids are doing well so definitely keep those lines
of communication open with the parents and include them in the celebration.
[ Pause ]
Maybe the student is not making progress.
If they're not making progress then, again,
we might want to consider changes in the intervention strategies.
So, again, as I mentioned look at the specially designed instruction.
Is the intensity suited to the student?
The duration, the frequency, all of these kinds of things.
Looking at the instructional materials, looking at the instructional arrangements,
the groupings, the use of peers, the way that we're delivering the instruction.
What motivational strategies and reinforcement schedules are in place.
How much time is allocated for each of the lesson components,
things like that you want to consider.
[ Pause ]
Okay, and in our last step of our seven-step approach we finally want
to consider how the progress will be communicated
to team members including, of course, definitely parents.
We want to make sure we have them involved the whole way through the process.
So if the student progress is only reported at the end of the marking period
without any opportunity to communicate
with the parent along the way then you've created an incomplete picture
of that student's progress.
We want to make sure that communication
about student progress actively involves this parent and,
when appropriate, the student as well.
When progress at school is sufficient, simply a phone call home or including a note
in the student's planner lets the parent know that the student is progressing appropriately.
Maybe the parent can help explain how things are going on at home,
and if they're doing anything extra to make sure that that progress at school is sufficient.
You want to get the input from the parents so you know what's happening.
If progress is minimal or it's not occurring
as expected then the parent also should be contacted.
And you can ask some questions like what suggestions do they have to help the student?
What's working at home?
Is the student sharing information with the parent that can help the teacher?
Communicating progress sometimes is the missing step in progress monitoring.
It really comes before the reporting of the data.
So it really comes throughout the marking period, not just every nine weeks
when we report the data to parents.
And we want to make sure that all parties have that chance to dialogue about ways to increase
and continue the progress that's being made by the student.
How that progress is communicated should be determined by the IEP team
and should be noted on the student's IEP.
The team should develop strategies to communicate progress in a way that the parents
or the recipient will best understand and use.
So ask parents what is the best way to communicate with you?
Is it through a phone call?
Is it through email?
Is it a note home?
Is it face-to-face?
But you discuss that with them and come up with the best system of communication.
[ Pause ]
Okay, so here's a sample of a progress monitoring report.
You know every nine weeks the parents will receive a report
of perhaps writing goals if this was for a writing goal.
Maybe sending bi-weekly writing prompts and giving a graph of the correct word sequences.
An analysis of the use of style in the writing prompts every two weeks.
Parents get a report of the PSSA writing scores over the summer.
That's another way that it might be reported.
[ Pause ]
As we conclude this slide gives you some additional resources that we came
across that really give some good description of progress monitoring and, again,
what types of data to collect and decisions to be made.
You can see the National Center on Response to Intervention and the National Center
on Intensive Intervention that were mentioned earlier.
Definitely the Pattan website.
Don't forget to look for those archived webinars
under the videos tab that Diane mentioned earlier.
Look for different publications under the resources tab.
And, as always, I will show you our contact information.
You can definitely email questions to both Diane and me,
and we will get back to you with answers.
So on behalf of Diane and myself, I'd like to thank you for attending this overview
of progress monitoring and to invite you to attend the next webinar.
As a reminder, Progress Monitoring for Writing will be on November 26th.
And you can find details and registration information for that
on the Pattan website under the training calendar tab.
And just look for November 26th and the title Progress Monitoring