Here’s the link to the study itself (pdf). From yesterday’s TIME column, here is the summary takeaway on performance:
Of the 40 CMOs that were selected for inclusion in the study for various reasons, including having a minimum of four member schools, 22 networks had sufficient data for the student-achievement analysis, which looked at three years of middle-school performance. The study found that, in general, students at charter-network schools outperform similar students at traditional public schools, although sometimes not by very much. But that overall average masks an enormous variation among different CMOs. High-performing CMOs are so effective they are providing the equivalent of three years of schooling for students every two years. But CMOs at the low end are so bad they are effectively costing students a year of learning every two years. Bottom line: 10 of the 22 CMOs are outperforming their public-school peers in math and reading, in some cases substantially; eight are middling; and four are serious laggards.
If you follow the issue closely, don’t miss the – cliche alert – treasure trove of descriptive information in this study about how CMOs are operating. A few pro and con commenters have opined that this data must be either “forged by public schools and teacher unions” or not valid because it doesn’t use a pure RCT or randomized model as, for instance, Caroline Hoxby does in her research on charter school effects. The methods are solid; learn about them yourself at the link above. The focus on middle schools stems from a data-availability issue. It’s harder to do longitudinal studies for elementary school students because most states don’t assess in the early grades, and at the high school level assessment policies are very mixed, creating data issues there and necessitating the use of other measures – e.g. graduation, college-going, etc. In addition to this ongoing research effort, the new Broad Prize for CMOs will also shed some light on those issues.
How you interpret the performance data probably has a lot to do with what you think about charters and CMOs in the first place. My take is two-fold. One, given where a lot of CMOs operate, I’m not surprised by the quality issues. A problematic mix of poor-quality charter authorizing and badly designed state policies creates an environment where school replication is not always a function of quality. But, while we can certainly do better there, even with improved laws and better authorizing no one should expect 100 percent success. There is inherent risk in creating new entities. Worth noting that even the very good CMOs have some individual schools that struggle. In my view the question is how much risk are we willing to tolerate and, given that we’re talking about schools, how much should we tolerate?
I always need to consult experts before speculating on the size of the impacts, but if this table is accurate, I’m not surprised. The report said, “Among CMOs, school-wide behavior policies and intensive coaching of new teachers are positively associated with student impacts in both math and reading.”
Gosh, can’t we all agree to support those two practices?
And the study also found, “At the CMO level, we do not find impacts to be associated with use of a uniform curriculum, extended instructional hours, frequent formative student assessment, or performance-based compensation.”
Gosh, shouldn’t we all agree to reject the top-down curriculum “reforms” that are sapping the soul of education? I’m not a fan or an opponent of the other three. The study should call into question “reformers’” near-religious beliefs in those ineffective practices.
CMO= Cover My Orifices?
Meanwhile, back in the real world where Andy does not venture:
John … The Mathematica study was a study of CMOs, not a broad study of curriculum and instruction, so the results provide no broadly generalizable evidence concerning practices that you believe should be adopted or those that you believe are “sapping the soul of education.”
Really??!! Why is everyone surprised and fighting over this? Some CMOs are good and really want to improve education for their students. The others… let’s just say that education as a business opportunity (edu-business) is the order of the day.
The bottom line is that it’s really up to the board of the authorizing district to decide whether money or quality of service is the motivating force behind allowing crap CMOs to continue or not.
So you can say CMOs are evil or CMOs are great. But the policy debate on this is really not at the level you think, and with budgets being what they are… good luck.
More on Imagine (and not the John Lennon kind):
http://www.stltoday.com/news/local/metro/7bc516eb-db20-5a8e-984a-d4fa7c7edf53.html
“a pure RCT or randomized model as, for instance, Caroline Hoxby does in her research on charter school effects.”
Or, for instance, the much larger national study of charter schools done by Mathematica.
By the way, neither of these studies is an RCT. They are both lottery studies, which use a randomization device (the lottery), but of course the lottery offer is non-binding, so these should be considered natural experiments or good quasi-experiments if handled correctly.
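To make that distinction concrete, here is a minimal sketch (with made-up numbers, not drawn from either study) of how a lottery-based natural experiment is typically analyzed: the random offer yields an intent-to-treat estimate, and dividing by the difference in take-up (the Wald/IV estimator) approximates the effect of actually attending.

```python
# Minimal illustrative sketch, not the methodology of either study.
# Winning the lottery is random, but attendance is not, so we estimate an
# intent-to-treat (ITT) effect of the offer and scale it by the difference
# in take-up (the Wald / instrumental-variables estimator) to approximate
# the effect of actually attending.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

won_lottery = rng.integers(0, 2, n)  # random offer (the instrument)
# Attendance is non-binding: most winners enroll, a few non-winners get in anyway.
attends = np.where(won_lottery == 1,
                   rng.random(n) < 0.75,
                   rng.random(n) < 0.05).astype(int)
# Hypothetical outcome: attending adds 0.2 test-score SD on average.
score = 0.2 * attends + rng.normal(0, 1, n)

itt = score[won_lottery == 1].mean() - score[won_lottery == 0].mean()
take_up = attends[won_lottery == 1].mean() - attends[won_lottery == 0].mean()
wald = itt / take_up  # effect of attendance for those induced to enroll by the offer

print(f"ITT (offer effect):    {itt:.3f}")
print(f"Difference in take-up: {take_up:.3f}")
print(f"Wald/IV (attendance):  {wald:.3f}  # should land near the assumed 0.2")
```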