Hot Data Action!

In today’s WaPo Jay Mathews picks up the trail laid down by Ed Week’s Sawchuk.  He invokes some pumpkin analogies that remind me of the Blueberry War of 2006.  And I’m pretty sure that in the pumpkin world results matter.   Anyway, my take is that we need to move forward on this issue but also be very careful because it’s not as simple as it sounds.

One step toward moving forward is better data; a new Data Quality Campaign report (pdf) looks at where we are on that. Note this, on linking teachers to students in state data systems:

Only 21 states have a teacher identifier system with the ability to match teachers to students; another 13 states plan to have this element by 2012, but 17 states report no plans to implement it. 

And that’s before you get to the political issues… 

2 Replies to “Hot Data Action!”

  1. Firstly, I’m glad that the report called data systems “an obtainable goal,” not something that is ready for use. Without prejudging whether NCLB-type accountability, value-added evaluations, etc. are a good or bad idea, I noticed a pattern: the approaches to longitudinal data that seek to advance accountability are being implemented faster. Again, without prejudging whether they do more good than harm, they obviously have trade-offs and unintended effects that would need to be addressed.

    But what about data systems that simply help students, the completely win-win solutions? How much have we invested in those? There is no downside to tracking middle school achievement to see whether students are ready for high school, yet only 12 states have adopted such systems. What about high school indicators to help kids prepare for college? Again, a technology with no downside has been adopted by only 10 states.

    I’m reminded of the proposal to track and reach out to elementary students with severe absenteeism. Again, isn’t that simply a technology for helping children?

    Why can’t we show the same commitment to “win-win” solutions as we do to solutions that may produce good things for students but may also hurt them?

    Maybe the devotees of data-driven accountability are right: you have to pick fights, and you can’t make an omelet without breaking eggs. But where are their priorities? Have many of them gotten so caught up in fighting teachers and unions that they have forgotten which are the means and which are the ends?

  2. The Data Quality Campaign report overlooks one important data quality factor: whether the tests states administer cap student progress or whether they are “ceilingless” tests. Too many tests under NCLB measure only whether students have achieved state proficiency standards and the extent to which they fall below them. If a bright or high-achieving child begins the school year at the proficiency or even advanced proficiency level (if such scores are available), those tests have no way of measuring whether that student has made any progress over the school year. While the school will be happy to report that student’s proficiency score at the end of the year, the score could mask the school’s failure to challenge the student or advance his or her learning at all. By requiring no measures of achievement once a child surpasses the proficiency or advanced proficiency level, NCLB simply provides no incentive for schools to focus on ensuring that gifted students are challenged to reach their potential. By relying largely on these flawed diagnostic tools, the DQC factors also do little to improve instruction for gifted students.
