Narrow!

I can’t help but think that if this new GAO report (pdf) had found more evidence of curriculum narrowing affecting the arts – instead of basically the opposite story – it would be getting, you know, wider attention than it is…At least they tried with the headline, but the key finding is: 

Most elementary school teachers – about 90 percent – reported that instruction time for arts education remained the same between school years 2004-2005 and 2006-2007. The percentage of teachers that reported that instruction time had stayed the same was similarly high across a range of school characteristics, irrespective of the schools’ percentage of low income or minority students or of students with limited English proficiency, or the schools’ improvement under NCLBA. Moreover, about 4 percent of teachers reported an increase. However, about 7 percent reported a decrease…

Read the whole thing to learn more about the seven percent, which does matter. Also, buried in the GAO analysis, in the text and a footnote, is some direct evidence that skepticism of the Center on Education Policy data on curriculum narrowing was quite warranted…

In any event, “everyone knows” that arts are being cut, there is a race to the bottom, NCLB is killing field trips, etc., etc., etc.

22 Replies to “Narrow!”

  1. NCLB-related mandates and the cult of standardized testing are narrowing the curriculum overall, limiting what is taught to only what is tested. Beyond the arts, teaching only to testable subjects limits instruction in many areas, notably science, social studies, and “specials” like art. Those of us who actually work in the schools, gathering highly valuable qualitative and anecdotal evidence from teachers and administrators, see a very dire picture in terms of school curriculum, ESPECIALLY at the elementary level.

    The most disheartening thing here is that with such an inspiring and transformational presidential election, few students are actually hearing about it because social studies is not taught. If it is taught, it is limited to testable knowledge only, such as names and dates of historical events. The really humorous thing going on in some schools right now is the scramble to teach science and social studies. Some limited research finds that content knowledge in science and social studies can benefit reading comprehension. For several years, teachers have not taught either of these subjects and are now running around trying to figure out how to get this content into their reading curriculum so that students can improve test scores.

    If only reformers would attend more to the qualitative aspects of the debate, I think they would find a pretty stunning portrayal of actual curriculum narrowing, a culture of audit, and a lack of curricular control by teachers.

  2. What Jeff said. Also, if the GAO study is based on teachers’ and schools’ estimates and self-evaluations, instead of actually measuring the time spent, it’s probably quite skewed. Not to put too fine a point on it, but the veracity of such surveys is not terribly high.

    Even if it’s true (which I doubt) that “only” seven percent of schools reported a loss of time on the arts, with a higher percentage in low-performing schools (which the report showed), “only” is a loaded word. If a prescribed drug had adverse side effects in “only” seven percent of patients, with a slightly higher rate among poor and minority patients, it would be subject to an immediate recall, and the line of lawyers filing suit would “only” stretch for miles. It’s easy to dismiss these findings when it “only” happens in someone else’s school.

  3. Are you about to concede that the billions spent on NCLB “only” did a little harm?

  4. Maybe this is a silly question that I could answer by reading the whole report, but why is the comparison being made between 04-05 and 06-07, when NCLB was signed into law in January 2002?

    Also, I think Robert’s right about survey data (although I’d assume, a priori, that the skewing would be in the direction of overestimating the narrowing). I’m less convinced by John’s point, however, since I don’t think it’s totally obvious that a small decrease in the availability of art education would be “harmful”.

  5. Speaking purely from personal experience, Paul, the skewing is in the opposite direction of the “gotcha.” John Thompson, I’ll wager, will confirm this, but when data is being collected that an administrator fears will look bad, the impetus is to say what they assume the district wants to hear. So if the assumption is that it’ll be embarrassing to say “we’ve cut back on art,” you don’t say it. I’ve since skimmed through this report, noticed that it compares the 2004-2005 school year with 2006-2007, and blogged about this over at Core Knowledge. By that barometer, I’d have reported no change in the amount of time my kids got art and music. Why? They got squat in 2004-2005. They got the same squat two years later. The narrowing had already taken place by 04-05. Can’t help but wonder about that.

  6. Much of the data this report is based on comes from the national longitudinal study of NCLB. The data is nationally representative, the survey is anonymous, and the instructional time questions are just a small part of the survey. There was no reason for respondents to lie either way.

    As to the short time period, the survey asked school personnel (teachers, principals) about what happened in their school in “the past 2 years” in terms of changes in instructional time. This helps make the data more accurate than if the question asked for a longer period of recollection. In fact, this was the main problem that made the CEP report on this topic last year completely unreliable – it asked a single district person (the Title 1 coordinator, as I recall) about their recollection of instructional time changes over the past 3-4 years (or more? I can’t recall now). When one goes that far back, people forget and Title 1 coordinators change. Worse, if one or two schools in a district of 20 reduced music or PE, one or two added a weekly music class, and 16 didn’t change – guess what that coordinator will remember? Whatever made noise in the local press or at the board meeting. And nobody makes noise when music is increased or unchanged. The CEP report was worthless, but well-known interests enjoyed tooting it all over town. This survey asked the question in each school, not once per district. The results are also in line with the 2007 IES report (based on SASS) that noted minimal changes in instructional minutes for non-math/reading subjects (2-4 minutes per day on average, as I recall). As Andy says, nobody is going to write up THIS report; it was initiated as a fishing expedition by the Congress, and when no fish were hooked…

  7. Ze’ev, note two things:

    First, anonymity doesn’t guarantee lack of bias in the results. “Had no reason to lie” is not the same thing as “did not lie or provide false answers inadvertently”.

    Second, while focusing on the previous two years would provide more reliable results, this is irrelevant, since the previous two years are not what was interesting in the first place. What was interesting was the previous, say, six years. “Measuring this other, irrelevant variable was easier” is not much of a defense of the methodology.

    Again, though, I haven’t read the whole report, so it might be defensible on the whole. But the defenses you offer are not adequate.

  8. What Paul said. It galls me how often researchers are willing to settle for what’s measurable, rather than what’s meaningful.

    I know my testimonial doesn’t mean anything to some because it’s not quantifiable, but I can tell you that the curriculum has narrowed in the time that I’ve been teaching (1999-2008). Then again, I could not point to a decrease in art instruction at my school during that arbitrary two-year window, because there was no visual art instruction at all and music wasn’t cut until 2007-2008.

  9. Ze’ev,

    Thanks for making our point. Ordinarily I immediately read the hyperlinks but this report was too blurry for my bifocals.

    A 7% narrowing over two years of the six-year experiment that was NCLB, disproportionately damaging the poor schools that the law was supposed to help? Over the entire life of NCLB, then, we might have seen narrowing in 15%, 1/4th, or 1/3rd of schools? Then what would the headline be? (And was the narrowing spread equally within schools, or disproportionately directed at the weaker readers?)

    What if we had invested the NCLB billions in arts, music, PE, health, and community schools? What if we had used the arts as a path toward better literacy?

    Or to put it another way, how much was the increase in blood pressure, obesity, blood sugar, etc. in poor students over those two years of a longer trend? How many billions will that cost us in Medicaid and other health costs?

    FYI: A quick-and-dirty counting of instructional minutes is the most inaccurate way to assess narrowing, because of block scheduling, and I’m unaware of a study which invested the time to do that properly. (And without consulting with teachers, I can’t imagine a social scientist not getting lost in the details. If you don’t understand why, ask a teacher.)

  10. Regarding ‘reason to lie’ I agree that anonymity is not a guarantee of anything. I was simply responding to those that argued teachers/admins will lie to make their school somehow look ‘better.’ If it is anonymous, their school will not look any different. If they still want to lie, they surely can. On any survey. That is all.

    Talking about the 7% accumulating over the years sounds like a semi-reasonable possibility until one realizes that those who argue it conveniently forget that there was also a 4% INCREASE in parallel. Please, at least PRETEND that you are fair. As to how much has happened since 2002, it would help those who throw out numbers like 15% – or 33% – to recall that NCLB did not start to be felt at the school level until at least the 2002-03 school year, and it really had no teeth until its demands started to bite a couple of years later. In any case, that is why I brought up the IES SASS-based results that showed the same thing – rather trivial changes in instruction time over much longer periods. Oh, I forgot – block scheduling? A nice red herring. The SASS report is for grades 1-4. Little block scheduling goes on in those grades.

    And Sam is right. His testimonial does not mean anything. The plural of anecdote is not data.

    Andy’s point about ignoring reports one doesn’t like repeats itself here very well. Re-read what John Thompson effectively says: “The data is not conclusive enough to eliminate even a 33% reduction – I have no support for this claim, but prove me wrong; and in any case, imagine what the money could have done if applied elsewhere (pick your favorite: community building, obesity reduction, art support, SCHIP).”

    Please.

  11. Ze’ev, note that you’re conflating 2 different things. The question is, what are the merits of the GAO report? Your defenses of the report are inadequate, but you’ve confused yourself by bringing into the discussion the merits of some *other* report.

    (As for this other report – the one by IES – it’s not even clear which report you’re talking about. They put out a large enough quantity of reports that you need to be more specific.)

    But in any case, I neither know nor particularly care whether there’s been a small decrease in time for art education. But I know enough about educational research methods to know that the GAO report, as described, does not sound terribly reliable.

  12. I thought that I was quite clear why I find the GAO report generally credible: it relies on data collected from appropriate people with appropriate methodology.

    I also made two comparisons: with the CEP report on the same topic from a year ago (http://www.cep-dc.org/_data/n_0001/resources/live/07107%20Curriculum-WEB%20FINAL%207%2031%2007.pdf), which I argue was badly done but got a lot of press because it claimed to have found sharp narrowing of the curriculum, and with the IES report from mid-2007 (NCES 2007-305 at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2007305), which used SASS data and found no such narrowing, in line with the current GAO report.

    If you have already made up your mind on the GAO report based on what you heard so far, I am sorry that I am confusing you with data. The same apology goes to others who feel that way.

  13. Ze’ev, I think the underlying issue is that you don’t much understand good social science research methodology, since you apparently don’t appreciate (1) why survey data can be unreliable and (2) how to develop an experiment to test a hypothesis. 04-05 was not pre-NCLB. Therefore, it makes no sense to *treat* 04-05 as pre-NCLB, and this is true even if it’s easier for people to remember 04-05 than some other year.

    Note also that while you worry about how hard it would be for individuals to answer a survey accurately if they had to think back more than 2 years, you apparently haven’t thought about *any other reasons* why it might be hard for people to answer a survey accurately.

  14. Ze’ev,

    Where did that come from?

    “Re-read what John Thompson effectively says: “The data is not conclusive enough to eliminate even a 33% reduction – I have no support for this claim, but prove me wrong; and in any case, imagine what the money could have done if applied elsewhere (pick your favorite: community building, obesity reduction, art support, SCHIP).”

    No, I won’t be a hypocrite. I think I know where it came from: my last paragraph, the one with all of the typos. It also had an aggressive comment that I should have edited out.

    I apologize.

    I think I would have re-written that paragraph in a more constructive way. I just got interrupted and posted too quickly.

    You also had a great response in that block scheduling doesn’t apply to elementary school. It applied in a previous study that I must have confused this one with.

    As to the substance, the report studied 1/3rd of the time span of the law – so far. So, do you multiply 7%, or more or less, by three? And when you are estimating the downside of NCLB, do you estimate a minimal amount of damage due to this one factor, or a significant amount of damage done to the kids the law was designed to help? (When measuring the positive results of NCLB, the law’s supporters do the same, but your post was not reassuring in its arguments that the damage side of NCLB might be smaller.)

    My point was that the opportunity costs of that expensive law have been huge, and it still seems to have caused more damage than a good law should have caused. (The 3% growth could cut both ways, serving as a reminder that maybe we should have invested more in the arts in poor schools.)

  15. Also, Ze’ev, I thought I’d written 15% to 1/4th, which also was something I should have written as 21%, or seven times three. The 33% was a complete typo masquerading as bombast, and it was in an early paragraph that should have been edited.

    sorry again

  16. John,

    I don’t know if we need to multiply the 7% by 3, or multiply 7-4=3% by 3, or raise e to the (-i*pi*7%) power. Arguments can be made for any one of them. For example:

    (a) 2 years is 2/7 of NCLB’s life, so let’s multiply the 7% by 7/2. The 4% increase doesn’t count, as it reflects a “good thing” and we want to focus only on bad things.

    (b) The same argument, applied a bit more fairly to 3% instead of 7% (7% - 4% = 3%).

    (c) 2004-5 to 2006-7 was probably the period of the most radical changes in curriculum due to NCLB: the law’s bite started to show, and accountability systems started to fall into place in all states. Earlier there was little reason to change, and after 2007 reports started to complain about curriculum narrowing, so most schools started to shy away from taking away arts or social sciences. Hence the 3% is an upper bound on narrowing due to NCLB, and it will probably decrease over time.

    Which one is right? We don’t know. However, the best data we have show that there was little long-term change until 2003-4 (NCES report) and that there seems to be limited change between 2004-5 and 2006-7 (GAO report). All the rest is speculation and hyperbole.
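    Just to make the arithmetic of these three scenarios concrete, here is a quick back-of-envelope sketch. The 7% decrease and 4% increase are the GAO survey figures; the seven-year “life of NCLB” and the scaling factors are assumptions for the sake of argument, not anything the report itself claims.

        # Back-of-envelope only: the 7% decrease and 4% increase come from the
        # GAO survey; the 7-year "life of NCLB" and the scaling are assumptions.
        decrease, increase = 0.07, 0.04
        years_surveyed, nclb_years = 2, 7

        # (a) scale the raw 7% decrease to the full life of the law
        a = decrease * (nclb_years / years_surveyed)                # ~24.5%

        # (b) same scaling, but net of the schools reporting an increase
        b = (decrease - increase) * (nclb_years / years_surveyed)   # ~10.5%

        # (c) treat 2004-5 to 2006-7 as the peak years, so the net 3% is a cap
        c = decrease - increase                                     # 3.0%

        print(f"(a) {a:.1%}, (b) {b:.1%}, (c) {c:.1%}")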

  17. Okay, Ze’ev, I glanced at both reports. Before I get into that, I want to say that it occurs to me that we cannot assess narrowing just based on minutes spent on the arts. If reading instruction is based mainly on phonics and devoid of literature, that’s a narrowing. It could still be the same number of minutes, though. If afternoon recess is cut to make room for test prep, that’s a narrowing. If we don’t have time to study the presidential election because the math test is coming up and the writing test is in the spring and ELA comes next, that’s a narrowing. As other commentators have said, this debate could benefit from some qualitative analysis.

    But still, let’s look at the reports. Here’s what you and Andrew did not say about the GAO report:

    “Elementary school teachers at schools identified as needing improvement, those at schools with higher percentages of minority students, and those at schools with higher percentages of students with limited English speaking skills, were significantly more likely to report a decrease in the amount of time spent on arts education compared with teachers at other schools.”

    At schools designated as needing improvement, 11% saw a decrease in those two years, as opposed to 3% who saw an increase. Schools that struggle to raise test scores feel the most impact of NCLB, and we can see that the impact is greatest on arts instruction in their schools. True, most of those schools said arts instruction stayed the same. But if we asked some more questions to really get at the issue of narrowing (what about recess? test prep? PE? Social Studies? lit?), I would expect that we’d find something different.

    And here’s what you did not say about the NCES report. This was the main finding from the summary page:

    “Findings from this report show that combined teacher instructional hours in first-through fourth-grade English, mathematics, social studies, and science increased between the 1987–88 and 2003–04 school years. This was due to individual increases in English and mathematics instruction. Over the same time period, instruction in science and social science saw an overall decrease.”

    Okay, so the report was a study of the four core subjects. What does this have to do with arts education? Am I missing something? Should I go back and read more?

    It appears to me that the data from these two studies is not complete enough to really tell me much about narrowing in the curriculum. A researcher would need to define narrowing (might require talking to teachers) and then go beyond the numbers available here to really find out.

  18. After following this comment thread, I see that many here have come to some rather stunning revelations: that we may need to go beyond the numbers from government reports and actually talk to some teachers to see if curriculum narrowing is indeed taking place. If qualitative or mixed-methods approaches are actually utilized, it is amazing what can be found beneath the numbers. Lo and behold, curriculum narrowing is actually occurring and has been documented since NCLB’s inception, and perhaps a bit earlier with the proliferation of standardized testing. Many states and local school districts, for instance, mandate 120 minutes for language arts instruction, just to start. This does not include mandated time for math and extra reading interventions. There is little time left to address social studies, science, and the specials, like art, music, and PE. Teachers understand this and communicate it to my colleagues and me on a regular basis. The problem I have, however, is why teachers continue to swallow this bitter pill, considering that they are held accountable and under surveillance from a distance. Few policymakers and government officials like to actually mix it up with teachers in the field, which is why I’ve encouraged a lot of my cooperating teachers and even pre-service students to kick the mandates to the freaking curb.

  19. First, let me say that I am impressed that some of you have finally decided actually to read (I guess “glanced” is close enough) the report before offering your comments. Refreshing!

    Let me respond to the specifics first.

    Yes, the GAO report found that the reduction in arts instruction in Title I schools is greater than in non-Title I schools. Would you expect it to be any different? Getting kids to learn how to read strikes me as more critical than teaching them to discuss the arts. Title I schools show much larger failures in their ability to teach kids to read, and it seems reasonable that they – properly, IMO – focus on reading first.

    The NCES report indeed says what you quote. Perhaps, if you actually read it instead of just glancing, you would also note that the time dedicated to academic instruction increased over this period by 0.6 hours/week, accounting for much of the increase in reading and math. As I have repeatedly noted, the decreases in science and social science are rather small and come to about 3-4 minutes per day.

    Let’s now turn away from numbers and talk some speculation (“beyond numbers” and “qualitative” in Jeff’s parlance). Yes, indeed. If one is permitted to define narrowing as one wishes, there is an absolute certainty that one will find whatever one wants to find. That’s why we have blogs and comment threads like this one. But until someone comes forward and suggests exactly what to measure, and precisely how to measure it, we are just toying with ideas. Yet the outcry about ‘narrowing the curriculum’ came with very specific charges of large reductions across the nation in the arts, science, and social sciences (and, incidentally, PE). Now that this has been essentially disproved, some seem eager to re-define the charges so they can be kept alive.

  20. Ze’ev,

    I do not know where you get your information, but curriculum narrowing has not been disproved. In fact, the charges have grown since NCLB’s inception. Now, you make the call that we should move beyond numbers; fine, let us do that. But in the same breath, you demand that we come up with some sort of measure. What kind of dashboard variable do you propose, man? These things cannot be reduced to mere minutes and seconds. Talk to educators, particularly at the elementary level, who contend that mandated minutes in math and language arts leave little time for anything else. This is all largely due to the fact that alternate curricula in science, social studies, and the like are simply not tested. If they’re not tested, then the schools, which rely on test scores as THE quintessential measure of their worth, don’t teach them.

    I see it as a very big coincidence that administrators encourage more time being spent on math and reading while these are the two subjects that are typically tested. Now, we’re not exactly talking about high quality time either. The math and reading that is actually taught is stripped down, rationalized, drilled, rehearsed, and repeated. Many schools still rely on basal readers. Heck, while we’re at it, why don’t we just go back to hornbooks? That seemed to work in the Latin grammar school days of jolly old England.

    You just can’t rely on these reports: the minutes are averages of averages, and they are also entirely self-reported measures. What teacher is going to admit that they NEVER teach science and social studies, especially those who have been around long enough to remember pre-NCLB days? Additionally, some teachers, not all, try to make up for a lack of science and social studies via integration of curriculum or by cramming it into any old ten or 15 minutes they can find. All of these estimates upon estimates add up to what teachers report as their time spent on subjects other than math or reading. What kind of instructional time is this? Not the kind we should support, and all of this is due to the culture of desperation we have found ourselves in as a result of punitive accountability.

  21. Ze’ev, you say that you don’t have a problem with narrowing the curriculum for Title I schools (i.e., poor kids), so why are we having this debate?

    Please don’t denigrate all qualitative research methods as “speculation.” Can you not see any place for them in this debate?

    BTW, effective teachers find ways to teach literacy across all content areas – not eliminating some while “focusing on reading first.” But high-stakes tests seem to promote short-sighted curricula.

  22. I had heard that this was coming for quite some time. However, I did not think that we would get to this point. The narrowing of the curriculum has placed pressure on schools, teachers, and students to be proficient, but it all seems to be missing certain elements. I believe that curriculum and education are narrowing too much. It seems that we have forgotten about the jobs that will need to be filled in many different areas, from the arts all the way to manual labor. Not every person in the United States is an academic person, and I do not understand why we are trying to make it that way.
