Ballardvale Research has posted a PDF entitled "Online Crisis Management: 30 Top Colleges/Universities Respond to Katrina." Though I have no doubt of the researchers' good intentions, this strikes me as a prime example of how poor methodology yields irrelevant data.
The title of the piece is a bit misleading, because it isn't a study of 30 top schools ... rather, it is a study of what the researchers think are THE 30 top schools. And from whence cometh this data? Why, from US News & World Report! For those of you not in the academic world, let me explain that the US News & World Report rankings are generally considered a troublesome joke, largely divorced from reality. For example, one of the top 100 schools in their 2006 rankings nearly lost its accreditation last year. Here's a simple formula: if you are on accreditation probation, you are not a top school. In fact, you are likely one of the worst schools in the country (or at least in your accrediting region).
We would simply laugh at the annual rankings, but the problem is that most parents understandably lack the insight into the academic world to tell the good schools from the poor ones. Good students get pushed by their parents toward schools ranked high on the list. The end result is that schools fight to get near the top of the list while knowing all along that the list has little intrinsic value [I would link to a bunch of articles in the Chronicle of Higher Ed here, but all the ones I want require a subscription. So take my word for it ... this is a common perception among academics].
This study, then, uses the US News rankings uncritically ... something that no one involved in research should ever do. This poor choice is compounded by the regional bias of the rankings ... none of the schools is anywhere near the hurricane-affected area. As a result, the report focuses on the efforts of schools unlikely to have a large number of students affected in any way.
The report singles out Grinnell, Harvey Mudd, and Caltech for "muted" responses, ignoring the obvious fact that these schools are in Iowa and California. The authors appear to sense that they are stretching when they write, "Caltech, with 19,000 living alumni (and presumably some living along the Gulf Coast)..." My own presumption is that the vast majority of alumni from a school in California live on the West Coast, not the Gulf Coast. But let us assume, for the sake of argument, that fully half of Caltech's alumni live on the Gulf Coast. Except for students and academics, who out there thought, "Gosh, that Hurricane Katrina is awful. You know, I should really check my alma mater's website to see what I can do"? My guess is that most alumni would check in with the International Red Cross or some other relief organization. Heck, I'm an academic, and I didn't bother checking to see what my undergrad school in Indiana or my grad school in Michigan were doing. What reasonable person would?
Up to now, I've been criticizing this particular study, but that is not my overall point. What I really want to demonstrate is that methodology matters; it is not some esoteric concern only for those doing theoretical work. Because of poor methodology, the report is not wrong ... worse, it is irrelevant. A relevant report would study the responses of colleges and universities in Texas, Louisiana, Mississippi, Alabama, Georgia, and Florida, the states that actually have large numbers of affected students. Unless I miss my academic geography, the Ballardvale report examines not a single school from the affected region. If the researchers want to do relevant work, they should adjust their methodology. A report on relevant schools ... now THAT would be interesting.