News

U.S. News & World Report College Rankings Released

by Lane Florsheim

Johns Hopkins University is better than Brown. WashU trumps Vanderbilt. USC is a superior institution of higher learning than UNC. Ah, yes, the annual U.S. News & World Report College Rankings came out Tuesday. Cue the groans.

What's so wrong with the list and the frenzy it creates? Quite a lot. For starters, as the Atlantic explains, the rankings actually drive up the cost of college, which is already steep, because the formula used to compile them rewards colleges and universities that spend more money. So schools do exactly that, and then raise tuition to cover the questionably necessary spending.

What's more, the publication's formula changes from year to year, so there's no consistent standard for making meaningful comparisons from one year to the next. Critics maintain the metrics change annually not so much to refine the methodology as to market the product: changes in the metrics shuffle the rank order, prompting people to buy the latest rankings to see where their alma mater or dream school landed.

The ever-changing formula also fails to account for the quality of education at each school (seems important) and for what the Atlantic terms "outcomes," meaning what happens after graduation, such as whether students actually land jobs. Seems like information worth knowing.

Plus, as Bullett's Luke O'Neill points out, it's ridiculous that many of us base such an important, life-changing decision (even partially) on an impersonal ranking system whose methodology is not entirely clear.

Here's an eloquent explanation Reed College President Colin Diver wrote for the Atlantic about why the school declines to participate in the rankings.

For ten years Reed has declined to fill out the annual peer evaluations and statistical surveys that U.S. News uses to compile its rankings ... One-size-fits-all ranking schemes undermine the institutional diversity that characterizes American higher education. The urge to improve one's ranking creates an irresistible pressure toward homogeneity, and schools that, like Reed, strive to be different are almost inevitably penalized.
... By far the most important consequence of sitting out the rankings game ... is the freedom to pursue our own educational philosophy, not that of some newsmagazine. Consider, for example, the relative importance of standardized tests. The SAT or ACT scores of entering freshmen make up half of the important "student selectivity" score in the U.S. News formula ... We have found that high school performance (which we measure by a complex formula that weighs GPA, class rank, quality and difficulty of courses, quality of the high school, counselor evaluation, and so forth) is a much better predictor of performance at Reed. Likewise, we have found that the quality of a student's application essay and other "soft variables," such as character, involvement, and intellectual curiosity, are just as important as the "hard variables" that provide the sole basis for the U.S. News rankings. We are free to admit the students we think will thrive at Reed and contribute to its intellectual atmosphere, rather than those we think will elevate our standing on U.S. News's list.