Rankings Redux

Couple of columns about the US News Best High School rankings recently. First, NYT’s Sam Freedman is shocked, shocked!, that US News seems to want to make money by selling magazines. I’m not sure it’s breaking news that US News isn’t a charity and is in the money-making business (and last time I looked the NYT is a publicly traded company). Isn’t the more important question whether or not the rankings recognize and advance some of the things we want schools to do? Freedman notes that the rankings do that. That debate, rather than whether US News is making money, seems the more important issue.

Jay Mathews addresses the rankings quality question in his weekly WaPo column. Though he calls on a few anti-testing types (surprise, Alfie Kohn doesn’t like high school rankings!) it’s a fun discussion because there is no single way to do a ranking like this; it all depends on what you choose to value and measure and what you can actually measure in a way that is comparable across states.

But, a couple of points raised by Mathews or people he cites in the column warrant a quick response. Most notably, some of the comments in the column as well as other things that have been written about this reinforce the notion that people should read the document about the methodology (pdf).

First, the US News ranking does not base its ranking on state test scores, which are not directly comparable. Instead, it uses a school’s overall performance as well as the performance of disadvantaged students as two screens to filter out schools that might be doing a good job giving some students access to AP but are not providing a strong education across the entire student body. That’s why many schools that do well on the Challenge Index fall off the US News list.

Remember that when Sara Mead and I took a look at the performance of the top 100 Challenge Index schools on their state assessments we found average black-white gaps in pass rates of 26 points in reading and 31 points in math. One school had a 61 point gap in math! More on all that here.

The reason is straightforward: while neither the Challenge Index nor the US News method can account for graduation rates – which are still not calculated in a uniform fashion – schools can rank highly on the Challenge Index even if most students in the school are not being well served. The two US News screens for school performance help address this.

But then US News ranks schools based on AP tests as a college prep measure. This is why I think it’s an improvement over what the Challenge Index does and the Newsweek list: It still measures college prep but it knocks some bad actors off the list. The difference is that Mathews counts AP and IB tests taken, while US News counts only AP tests passed.

Where Jay and I disagree is that he doesn’t mind the presence of some pretty inequitable schools on the Challenge Index. His view is that offering the AP courses is so important that it should be rewarded, even if other measures of school performance are poor. I think AP course-taking is important, too, just not so much that overall educational equity and quality should be subsumed to it. And, given the performance data we have today, it’s not an either/or choice. A ranking can look at both.

Second, one person in Jay’s column makes the point that comparing selective public schools with other public schools is like statistically comparing the Baltimore Orioles to the American League All-Star Team. Sure, but here’s the thing: The All-Star team is statistically better! US News made a decision to include these schools because they are good schools, and I think it’s less interesting that 19 made the top 100 list than that 81 other, open-admission, public schools did as well! That’s the story. Being a selective school is not by itself enough to get you on this list; many did not make it, but the ones that did are pretty good. And the other schools that rank up there with them should be especially proud.

The second judgment call that was made is to include schools even if they don’t have disadvantaged kids at all, or enough to show up in the data. Langley High in Fairfax County is a good example of this; the “poor” kids at that school drive SUVs rather than BMWs. But, and you can argue this either way, is it fair to penalize the educators at Langley – which is a very good school – because of the demographics of the community they work in? And, don’t we want all schools to perform at a high level, regardless of demographics?

In other words, both lists “err” in one direction or another. US News “errs” by including schools that have favorable demographics, Newsweek errs by including schools that aren’t very good except for offering a lot of AP and IB courses to a narrow band of students. I think the US News approach is a lot more defensible. The schools that make it on that list as a result of these two judgment calls are still very good high schools. On the other hand, here’s the data on two schools on Newsweek’s list this year:

…Eastside High School in Gainesville, Fla., 17th on this year’s list. Eastside did not make “adequate yearly progress” or AYP under the federal No Child Left Behind Act in 2006, although only about half of its students would have had to pass Florida’s math and reading tests to meet such performance targets. Eastside also received a “D” on Florida’s state school accountability system. The achievement gap between Eastside’s white and black students is enormous: 80 percent of white students are proficient in reading, but only 13 percent of black students. In math, 89 percent of white students, but only 15 percent of black students, are proficient. The achievement gap for economically disadvantaged students is just as big. And Eastside’s black and disadvantaged students perform well below state averages for their peers.

Or, take Southside High School in Greenville, S.C., 76th on this year’s list. Southside did not make AYP in the 2006 school year and is rated “unsatisfactory” under South Carolina’s state accountability system. The majority of Southside’s students are low-income and minority students who tend to do worse on state tests. But according to South Carolina’s state school report card, Southside’s students do worse than students in demographically similar schools on state end-of-course tests in math and English. And it graduates only 59 percent of students, also worse than state averages.

If I were advising Newsweek I’d either figure out a new way to characterize the list of schools the Challenge Index generates (Jay’s concern with AP/IB access is an important one) or figure out some screen to apply here as well to address the school quality problem that plagues the list. As the Challenge Index celebrates its tenth anniversary it’s worth noting just how far education data has come during that period as well. We can measure more today. So the problem with the Challenge Index isn’t that it measures the wrong thing, only that it can’t be the only thing on a list purporting to show the nation’s best high schools.

2 Replies to “Rankings Redux”

  1. My concern is the seeming overemphasis on the achievement gap in the U.S. News rankings. At least in Mo., Mathews’ concern about the ranking leaving out schools with minority populations is true in my state–no St. Louis county schools even made silver or bronze. The only schools on the list are rural homogeneous districts. I’m assuming the way the voluntary transfer program is set up affects the numbers. Are other cities affected similarly?
