Value-add measures for teachers are complicated. Two takes fresh out today. Shorter versions:
From the teachers’ union-funded EPI (pdf): We don’t want to say don’t use value-add, but use it only a very wee little bit! We’re more bullish on peer review, but ignore the evidence there please!
From U of W’s Dan Goldhaber: Use it responsibly and beware of the limitations. Why on earth is the LAT doing what it’s doing?
Goldhaber’s take is sensible. EPI is right that the fetishizing of 51 percent of evaluation from value-add isn’t wise (and it’s also not practical as a comprehensive tool). And they sensibly call for a federal push to innovate with various evaluation models. But isn’t that what’s happening under Race to the Top and related initiatives?* And since we really don’t know what works here yet, there is nothing wrong with states innovating with heavy value-add models (meaning weighted at 50 percent or more), too, is there? Besides, it’s worth noting that models that weight value-add at much less than 50 percent get attacked, too.
In fact, I’d argue the real issue is less the specifications of any value-add model, or any evaluation system that uses value-add, and more outcome-based evaluation itself. Most of the debate today is camouflage for that.
*Take for instance the DC IMPACT model, which is a pretty good tool.
Hi Chris,
Thanks for your response.
I just figured that anything we found and brought back to you, you would take issue with comparing to TFA.
So, I was trying to avoid that middle step.
And maybe help everyone in the thread see that some real comparison could be made to TFA.
You know, to stop all the TFA bashing around here.
Thanks anyway, though.
Steve F.:
Interesting dilemma. Do I wait for someone to bring up another teacher prep program that has amazing numbers, or do I painstakingly search myself for such a program?
In either case:
1) Whatever data analysis I present will continue to be ignored.
2) The initial topics brought up in this thread will continue to be ignored.
3) I will continue to be vilified.
4) There will continue to be resentment of the big TFA conspiracy.
I also like that you’re putting the onus on *me* to stop the TFA bashing that goes on here, rather than on the lazy commenters who continue to frequent these discussions.
And, no, thank you for the response.
Why would TFA people have to be put in a different program? Using state data, one can track TFA grads as long as they stay in teaching. The TFA designation does not automatically disappear. So UTEACH has a much, much higher retention rate. Why can’t you compare the two? UTEACH provides initial mentoring and support, but so does TFA. UTEACH tries to place its grads in the same schools so they can provide support to each other. TFA does the same thing. Yet almost no TFA teachers make it past three years, while a huge chunk of UTEACH grads do. So which program affects students more? UTEACH, of course!
It’s really pretty simple.
And your link did not work.
You seem mighty angry that UTEACH has a much better track record on retention than TFA, even though both are elite programs.
Fortunately, the business community is funding replication of UTEACH across the country. Maybe they will put TFA out of business. One can only hope–for the kids’ sakes.
Billy Bob:
You again ignored all 7 critiques. That is quite a feat.
1) Yes, many UTEACH students do not teach. Thank god they figured out they shouldn’t be teachers before they actually screwed up some classrooms full of kids. And I would venture to guess that not everyone who enters TFA eventually teaches either.
My fav comment: “exposing them to ed coursework and student teaching.” Hmm–you mean they PREPARE their teachers? The horror!! Who do they think they are, actually PREPARING PEOPLE TO TEACH?
And a fair percentage of UTEACH grads teach in high-poverty schools. And they stay in teaching despite that. Many teach in Austin, Houston, and Dallas. TFA teachers teach in Houston.
By the way, an internal Houston ISD memo found that TFA teachers were more likely than other teachers to be ineffective under their VAM model.
In a southeastern state, teachers from traditional prep programs who taught in the VERY SAME schools as TFA teachers were far, far more likely to stay for a 3rd, 4th, 5th, and 6th year. Same schools. Can’t compare TFA in Texas–TFA does not want its teachers identified in the state data system.
UTEACH has multiple cohorts of teachers now and the retention rate hovers between 75% and 80%.
The validity of the TFA studies is in question as well, especially when you take into account the negative effect on students of greater attrition.
Face it–TFA and UTEACH can be compared and TFA is no better and may be worse.
And at least UTEACH uses REAL data to calculate retention, not some survey with a 62% response rate. Error rate on UTEACH retention–0%. Error on TFA study–unknown.
Billy Bob:
It’s like you’re reading something other than my arguments when you respond to me. Is it a good read, whatever it is?
“Yes, many UTEACH students do not teach. Thank god they figured out they shouldn’t be teachers before they actually screwed up some classrooms full of kids. And I would venture to guess that not everyone who enters TFA eventually teaches either.”
You really don’t get it. You are trying to compare retention rates of two very different programs. It’s not even about a vague sentiment of “some don’t teach”; there is an accumulating percentage of attrition that continues throughout UTeach. And this is not just from people who decide “hmm, this coursework isn’t right for me.” The numbers I cited were losses of students *AFTER* they started student teaching. The official teacher retention percentages offered by UTeach do not include any of the many students who decide, *AFTER* student teaching, that they don’t want to teach anymore. The program is designed that way, which is not a fault of the program itself; the fault lies with careless attempts to compare different data sets.
In contrast, TFA teachers commit to at least two years of teaching without this accumulating attrition before they begin their careers. Whatever number is reported for attrition rates of TFA teachers encompasses the career decisions made by the ENTIRE cohort of TFA recruits; with UTeach, the attrition rate encompasses ONLY THOSE who have already taught and decided they want to stay in teaching. That latter number estimates attrition from a group of teachers already self-selected to remain in the profession.
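(To make the denominator mismatch concrete, here is a minimal back-of-the-envelope sketch. Every number in it is invented for illustration; none comes from either program’s actual data.)

```python
# Hypothetical numbers only, to illustrate the denominator mismatch.

# UTeach-style accounting: attrition during the program (including after
# student teaching) is shed before the retention rate is computed.
uteach_entrants = 100        # hypothetical: enter the program
uteach_enter_teaching = 60   # hypothetical: finish and actually start teaching
uteach_still_teaching = 48   # hypothetical: still teaching some years later

print(uteach_still_teaching / uteach_enter_teaching)  # 0.80 headline "retention"
print(uteach_still_teaching / uteach_entrants)        # 0.48 from the full cohort

# TFA-style accounting: the entire recruited cohort is the denominator.
tfa_recruits = 100           # hypothetical
tfa_still_teaching = 48      # hypothetical
print(tfa_still_teaching / tfa_recruits)              # 0.48 headline "retention"
```

Identical outcomes on the ground, very different headline rates, purely because of where each program starts counting.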
“And a fair percentage of UTEACH grads teach in high-poverty schools. And they stay in teaching despite that. Many teach in Austin, Houston, and Dallas. TFA teachers teach in Houston. ”
Can you cite that “fair percentage” of grads teaching in high-poverty schools? Is it 100%? If it’s not, then my #2 critique still stands.
“By the way, an internal Houston ISD memo found that TFA teachers were more likely than other teachers to be ineffective under their VAM model.”
Yup, because an internal memo is a much more reliable metric of teacher effectiveness than all of the research with sound methodologies, right? (http://www.teachforamerica.org/about/research.htm#card )
“In a southeastern state, teachers from traditional prep programs who taught in the VERY SAME schools as TFA teachers were far, far more likely to stay for a 3rd, 4th, 5th, and 6th year. Same schools. Can’t compare TFA in Texas–TFA does not want its teachers identified in the state data system.”
It seems you are citing data from your unpublished dissertation again, am I right?
“The validity of the TFA studies is in question as well, especially when you take into account the negative effect on students of greater attrition.”
This also seems to cite evidence from your unfinished manuscript. Would you care to explain how the student achievement of a 1st grader is affected retroactively by his teacher leaving the profession 2 years later?
“And at least UTEACH uses REAL data to calculate retention, not some survey with a 62% response rate.”
Surveys that receive the same response rate as other TFA surveys don’t give real data? But data from a sample size of 66 teachers *is* real data? We shouldn’t trust studies that haven’t addressed the biases of an angry anonymous researcher on Eduwonk? But we should trust a study that said researcher hasn’t finished yet?
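(For readers keeping score on the response-rate point, here is a minimal sketch, again with made-up numbers, of how much room a 62% response rate actually leaves in a retention estimate. The kernel of truth is that nonrespondents widen the bounds; whether that invalidates the survey is the part in dispute.)

```python
# Hypothetical illustration of nonresponse bounds; all numbers are invented.
cohort = 100                       # teachers surveyed
respondents = 62                   # 62% response rate
retained_among_respondents = 50    # say 50 of the 62 report still teaching

point_estimate = retained_among_respondents / respondents  # ~0.81

# Worst case: every nonrespondent has left teaching.
lower_bound = retained_among_respondents / cohort          # 0.50
# Best case: every nonrespondent is still teaching.
upper_bound = (retained_among_respondents + (cohort - respondents)) / cohort  # 0.88

print(point_estimate, lower_bound, upper_bound)
```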
Look. I’ll let you try again. Here’s the link to my critiques (https://www.eduwonk.com/2010/08/adding-value-2.html/comment-page-2#comment-209225 ). Read it. Respond to it. There are plenty of other counterarguments of mine that you similarly let fall by the wayside because you didn’t want to address them. I can link to them as a reminder if you’d like.
And while my first comment patiently awaits moderation, I’m wondering if you were trying to refer to Hammond’s 2005 study? If so, here’s the link: (http://eduwonk.net/2009/08/if-the-race-to-the-top-were-the-olympics.html#comment-94394 )
OK Chris–you convinced me. TFA is the BEST teacher preparation program ever invented. The teachers are extraordinary individuals and elicit great gains from the poor minority kids, who are ever so thankful to be taught by privileged white teachers from Ivy League schools. And the teachers from TFA are so committed that the majority stay much past their two-year commitment because they want to teach in these schools for extended periods of time. They don’t want to pad their resumes but really, really care about these kids and want to stay in the community for long periods of time. If we could only prepare a million TFA teachers, we could close the achievement gap, be number 1 in the world in every subject, solve our financial crisis, and everyone would live happily ever after.
Keep dreamin’, buddy.
Billy Bob:
I didn’t intend to convince you of anything. My only intention is to make sure readers know that you, like several others here, deal only in deception and prejudice rather than good-faith arguments.
On that note, I’m greatly looking forward to reading your coming paper.