Reposted from Tn Edu-Independent. I wanted to share some interesting things from a recent policy brief I read from TeachPlus. The brief can be found here: The Student and the Stopwatch
This research brief looks at how much time students are actually spending on testing. We hear all sorts of things on this topic: we test all the time, we test students to death, etc. When we look at the actual data, it turns out that we don't test that much.
"Across
12 urban districts, the average amount of time students spend on state
and district tests equals 1.7 percent of the school year in third and
seventh grades and substantially less in kindergarten."
1.7 percent of the year. Not that much.
One of the 12 urban districts they looked at was Shelby County (not MNPS), and they were at the lower end of testing hours. So given we're in the same state, with the same set of state assessments, we're likely not far off from that same amount of time in Nashville.
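To put that number in some context, here's a rough back-of-the-envelope conversion. The 180-day school year and 6.5-hour instructional day are my own assumptions, not figures from the brief, so treat this as a ballpark rather than a finding:
180 days x 6.5 hours/day ≈ 1,170 instructional hours per year
1.7% x 1,170 hours ≈ 20 hours of state and district testing
20 hours ÷ 6.5 hours/day ≈ 3 school days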
Certainly, there is the question of interim assessments given by schools and districts throughout the year. But this study also factored a number of those tests into its findings.
" Teachers were asked to report the precise state- and district-mandated tests they administer and the precise number of minutes required by that test."
"The kinds of assessments profiled in this study as state- or district-mandated generally fall into two categories. The first category is end-of-year summative assessments largely required by the states, such as the Massachusetts Comprehensive Assessment System (MCAS) or the Tennessee Comprehensive Assessment Program (T-CAP). These state-required summative assessments are often supplemented with district-required assessments that are used for formative or benchmark purposes. These can be administered infrequently, two to three times a year, or with greater regularity, as often as once every two weeks. Many of the districts included in this study have adopted assessment systems purchased from national providers, such as the Achievement Network (ANet), the Dynamic
Indicators of Basic Early Literacy Skills (DIBELS), or the Northwest Evaluation Association (NWEA)."
"The kinds of assessments profiled in this study as state- or district-mandated generally fall into two categories. The first category is end-of-year summative assessments largely required by the states, such as the Massachusetts Comprehensive Assessment System (MCAS) or the Tennessee Comprehensive Assessment Program (T-CAP). These state-required summative assessments are often supplemented with district-required assessments that are used for formative or benchmark purposes. These can be administered infrequently, two to three times a year, or with greater regularity, as often as once every two weeks. Many of the districts included in this study have adopted assessment systems purchased from national providers, such as the Achievement Network (ANet), the Dynamic
Indicators of Basic Early Literacy Skills (DIBELS), or the Northwest Evaluation Association (NWEA)."
One thing that's worth asking locally: Are the internal assessments that we have in the district throughout the year actually aligned with the state assessments? Are they telling teachers and school leaders accurate information to inform learning cycles, and doing so in a timely way?
You don't use a coffee cup to take a kid's temperature. Having internal assessments throughout the year that are supposed to tell you whether your students are on the right learning trajectory doesn't make sense if those assessments aren't aligned to content standards and the state assessment (which, incidentally, should also be aligned to the standards; I fully recognize districts in Tennessee are in some form of purgatory right now, caught between Common Core standards for math and ELA and the old TCAP test, a misaligned assessment).
Some other things this TeachPlus research found:
- The variation in test time across urban districts is large, with high-test districts spending 3.3 times as much time on testing as low-test districts.
- Urban districts spend, on average, more time on testing than their suburban counterparts. Suburban districts in this study average 1.3 percent or less of the school year on testing.
- In elementary grades, teachers calculate test administration time to be more than double the length reported in district calendars.
The testing debate certainly generates a lot of noise. As the TeachPlus brief finds:
"The
debate over whether there is too much or too little testing occupies a
prominent place in the policy discourse and in the media. However, the
debate is largely ideological and devoid, ironically, of data on the
amount of time students spend on testing."
If we're going to focus our energies on improving teaching and learning for all students in the district, one thing worth spending our time on is making sure we're prudent with the instruments we use to gauge student learning and progress. Those instruments are really important for informing the learning cycle and giving us meaningful data to guide instruction and professional development.