ISTEP Scores: It Depends on How you Look at Them
— Headlines in Indiana newspapers following the most recent release of state ISTEP+ scores
by Jeff Abbott, Ph.D., J.D.

The release by the Indiana Department of Education of the latest 2008 ISTEP+ scores was met with widespread dissatisfaction, as is always the case. Statistical analysis, however, shows all the angst is unnecessary. The news reports do not accurately reflect the performance of Indiana students or schools.
Teachers and school-district leaders have fallen into the trap of comparing only this year's scores with last year's scores. Even a top state department official, at the press conference where the test scores were released, referred to this year's results as "disappointing." One superintendent said it "bothers (him) that scores don't go up" each year. Another superintendent called a special meeting with his principals to "analyze and figure out exactly what went wrong." In yet another school district the assistant superintendent was reported to be "frustrated" with her district's scores.
The public does not understand the statistically valid way of analyzing ISTEP+ test scores. Indeed, most school leaders are on the defensive because they do not have the knowledge and skills to defend the quality of their education services. They have essentially conceded that all is doom and gloom in ISTEP+ Land.
To repeat, it is of little statistical use to compare one year's test results with only the previous year's results. The comparison is meaningless, particularly because the reported scores come from different groups of students and the tests themselves differ from year to year. With small schools and small classes, scores will always fluctuate up and down because the students tested each year are different students with different academic abilities.
Test scores must instead be compared over a period of years to determine their trend. This can be done in an Excel spreadsheet by plotting the scores as a line graph and adding a linear trend line.
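The same trend-line calculation Excel performs can be sketched in a few lines of Python. The pass rates below are hypothetical numbers chosen only to illustrate the method; the real figures come from the Indiana Department of Education.

```python
import numpy as np

# Hypothetical statewide ISTEP+ pass rates (percent) -- illustration only.
years = np.array([2002, 2003, 2004, 2005, 2006, 2007, 2008])
pass_rates = np.array([68.1, 67.4, 69.0, 68.5, 70.2, 69.8, 70.9])

# A least-squares straight line is exactly what Excel's linear
# trend line computes.
slope, intercept = np.polyfit(years, pass_rates, 1)

print(f"Trend: {slope:+.2f} percentage points per year")
```

A positive slope means the scores are rising over the whole period even though individual years bounce up and down, which is the point the article makes about single-year comparisons.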
The resulting chart shows that ISTEP+ scores for Indiana students are not dropping over the long run but are increasing. The linear trend line rises from left to right, indicating a positive trend in scores. Moreover, a widely used quality-process tool called a "control chart" can show that the scores are increasing because of something other than random chance. (An upcoming issue of the Indiana Policy Review will explore this finding in depth.)
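A control chart's basic idea can also be sketched briefly. Again the yearly pass rates below are hypothetical, used only to show the mechanics: the chart places control limits three standard deviations around the mean, and one common signal of a non-random shift is a long run of consecutive points on the same side of the center line.

```python
import statistics

# Hypothetical yearly pass rates (percent) -- illustration only.
scores = [64.0, 65.1, 66.3, 66.0, 67.2, 68.4, 68.1, 69.5, 70.2, 70.9]

center = statistics.mean(scores)
sigma = statistics.stdev(scores)
upper, lower = center + 3 * sigma, center - 3 * sigma

# Run rule: count consecutive years above the center line. A long run
# suggests a real upward shift rather than random year-to-year noise.
run = 0
longest_run = 0
for s in scores:
    run = run + 1 if s > center else 0
    longest_run = max(longest_run, run)

print(f"Center {center:.1f}, limits [{lower:.1f}, {upper:.1f}]")
print(f"Longest run above center: {longest_run} years")
```

With these made-up numbers, the last several years all sit above the center line, the kind of pattern a control chart flags as a genuine improvement rather than chance variation.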
Regardless, for 13 years now Indiana citizens have experienced the depression that is the annual release of ISTEP+ scores. Schools continue to be ranked much like the standings in the sports pages. Schools' test scores are compared with other schools' test scores. Schools' results are compared with their last year's scores. Seldom is there mention of the need to compare an individual school's scores over a period of time.
Moreover, ISTEP+ tests have increased in difficulty over the years, while levels of student poverty have risen nearly every year and the number of non-English speaking students has grown. Indiana scores nonetheless have risen over these years, leaving us with this inescapable conclusion: Indiana’s teachers and principals are performing yeoman service in teaching the students of Indiana.
Rather than being applauded, however, educators are roundly criticized. The next time you see a teacher or principal, you might want to pat him or her on the back and say, "Thanks for a job well done."

Jeff Abbott, Ph.D., J.D., is a professor of education at Indiana University-Purdue University Fort Wayne and an adjunct scholar with the Indiana Policy Review Foundation. Contact him at email@example.com.