AERA’ed, Plus Blame Google!

You can’t trust Alexander Russo to report on a school bake sale and give an accurate account of the price of brownies. Contrary to his characterization (and here is a more accurate account from the moderator), the two primary points I made on the issue of research and policy work at an AERA panel yesterday are that:

First, there is a difference between social science, on the one hand, and policymaking and journalism, on the other, in terms of the process for developing and using knowledge, and that's a healthy distinction. Basically it boils down to the approach to error and timeliness. Good social scientists favor the non-finding over the finding, but policymakers and journalists necessarily act in more time-bound ways.* That doesn't mean that research shouldn't inform policymaking or journalism; I obviously think it should. But it does mean that researchers cross lines when they start connecting dots beyond what their research shows. Russo seems completely ignorant of the criticism that researchers as diverse as, for instance, Linda Darling-Hammond, David Berliner, Paul Peterson, and Jay Greene have received at various points because of exactly this. Of course, connecting those dots isn't inherently bad; it's part of what analysts, journalists, and policymakers do all the time, but it's distinct from social science and the lines get blurry pretty fast. And the incentive and reward structures are quite different inside and outside of academia. None of that is to say researchers shouldn't be involved or that reporters shouldn't call them (I refer reporters to researchers frequently, and I made those remarks yesterday in the context of giving an audience of researchers ideas about how to work with the press), but rather that the issues here are not cut and dried at all.

Second, I again made the point that rather than seeking superficial cues about what is good research and what is not, people should become informed enough to make those judgments for themselves – especially people working professionally in this field. As I said yesterday at the panel, Alexander's characterization of research from traditional academic outlets as somehow more reliable than work from think tanks is a remarkably unsophisticated view of the production and consumption of intellectual information. That is not the same thing as saying that all organizations are alike, a point that didn't come up and that I didn't make. As I note in the article linked below, the search for superficial cues, for instance government funding, can lead to bad choices about the explanatory leverage of different pieces of work.

Contrary to Russo's account, I don't recall Brookings coming up in the discussion at all, although there are obviously differences and similarities between what they do and what organizations like Ed Sector do. Still, what's interesting there in terms of this issue is that one of the most important pieces of education work to come from Brookings in the past few years was a paper co-authored by Tom Kane. He's a Harvard professor but also works with the Gates Foundation and Joel Klein in New York City. The piece made an advocacy case about public policies (pdf). Russo's simplistic typology fails when applied to real-world examples like this. Urban Institute did come up, in the context of the new TFA study, and showed the folly of this entire think tank good/bad line of argument in two ways. First, one of the big papers being presented at AERA was an Urban Institute study. Second, if Urban is OK but Ed Sector is not to be trusted, then what about ES people who work with UI people and those projects and papers? Or what about UI people, or other academics, who advise ES? Again, the simplistic approach provides no clarity for consumers. This is why it's much better to take the radical step of judging the actual work itself rather than seeking shortcut cues.

What was unfortunate for the attendees at yesterday's session, most of whom were researchers, is that rather than more discussion about how to work effectively with journalists and strategies for getting their work into the public debate (if they want to, and in light of the issues above), the session got tied up in Russo's theory that reporters shouldn't trust think tank people, which is quite obviously rooted in personal grudges rather than actual evidence. He mentioned Mike Petrilli but offered no examples of where Mike hasn't been honest, and likewise he hasn't offered any evidence of the same around Ed Sector's work. Meanwhile, it's especially ironic that the same day Russo was on a panel grumbling that reporters call think tanks too much, a new study was making the rounds about the decline in think tank citations overall. Its title: "The Incredible Shrinking Think Tank." Anyway, in my experience most reporters don't need lectures about who to call for sourcing or how to do their jobs, and the current and former reporters on yesterday's panel basically made the same point.

This article from January discusses all this media-social science interaction in a lot more detail. When grounded in actual evidence this is an interesting conversation and an important one as methods of disseminating and consuming information change. Two very good books on all this are Rick Hess' "When Research Matters" and Jeff Henig's "Spin Cycle"; I'd highly recommend both if you're interested in these issues.

Related to all this, one point I didn't make on the panel but wish I had is that in a Google-driven world academics must find ways to make their work more accessible on the web. The journals I read, I still mostly read in hard copy, so I'm part of the problem. But one reason some academic research flies below the radar is that it's published in journals that live behind firewalls, and most people don't have JSTOR accounts or other access. Just as newspapers and magazines are wrestling with the dilemma of how much information to put out for free and what to charge for, academic journals need to have the same conversation if they want to stay relevant in today's digitally driven content environment. Google Scholar shows how much work is out there, but most of it is subscription-based: you can see it but can't get to it. I see this problem in my own work on this blog. Though not exclusively, I'm definitely more likely to write about something I can link to so readers can get to the original content. Links that go nowhere just frustrate readers.

*A second, related issue that I mentioned on the panel but should mention here is the news cycle itself. Researchers delve deeply into an issue for an extended period and build knowledge. The news cycle is more dynamic, and issues move on or off the agenda for a variety of reasons. That means a researcher can be doing great work, adding a lot of value to the literature in their field, but not doing work that's "newsy" or of particular interest to reporters or policymakers whose time horizons and constraints are more immediate.

2 Replies to “AERA’ed, Plus Blame Google!”

  1. “in a Google-driven world academics must find ways to make their work more accessible on the web.”

    AMEN

  2. Andy, John Willinsky, now of Stanford, is a strong advocate of open access to research on the web. He gave several illuminating talks at AERA and advised scholars about ways to make their own work more visible and accessible. Visit the Public Knowledge Project to see what he and his colleagues are up to. http://pkp.sfu.ca/
