More Reasons Why U.S. News Doesn't Get College Rankings Right

Barry Lenson

If you still believe that the college rankings in U.S. News are a useful tool for selecting a college, I direct your attention to “The Order of Things,” an article by Malcolm Gladwell that appeared in the February 14, 2011, edition of The New Yorker.

Over the years, the U.S. News rankings have been widely criticized. (Heck, this blog has questioned them a few times.) But Gladwell’s article takes the criticism to a new level by pointing out the utterly senseless and arbitrary criteria that the U.S. News editors use to compare and rank colleges.

This article is something you must read. But just to give you a small taste, here are a few of the compelling points that Gladwell makes:

It’s illogical to rank completely different schools alongside one another on the same list. Penn State and Yeshiva are drastically different institutions, yet U.S. News puts them both on one master list and says that Penn State is the 47th best college in America, while Yeshiva is the 50th. Is any student really going to use these ratings to decide whether to skip Yeshiva and apply to Penn State? Of course not. The schools appeal to very different students. Penn State is a huge university with a football team. Yeshiva is a small school in New York that might not even have a football team.

The criteria that U.S. News uses make very little sense and cannot be accurately measured. “Undergraduate academic reputation,” for example, is the most important factor that the editors consider when ranking schools; it counts for 22.5 percent of the total score that a college or university receives. And how does U.S. News assess each college’s reputation? It sends questionnaires to university and college presidents and administrators and asks them to rate all the schools in their own category on a scale of one to five. This approach is laughably flawed: there is no guarantee that a provost at a university in California knows anything at all about a similar school in Massachusetts, yet he or she gets to weigh in on the most important factor that U.S. News uses to rank colleges. Plus, the results in this critical category could be skewed simply because the president of one college dislikes another and wants to drive its rating down.

Other rating criteria are similarly flawed. A category called “faculty resources,” for example, counts for 20 percent of a school’s ranking. Yet this critical benchmark is assessed by weighing class size, the degrees that faculty members have earned, and a fuzzy criterion called “student engagement,” which is essentially unmeasurable.

Gladwell doesn’t put it this way, but if U.S. News’s system were applied to ranking zoo animals, you’d get a list saying that the lion was the #1 animal, the giraffe was #3, and maybe the kangaroo was #25. If people used those ratings the way they use the college rankings from U.S. News, everybody would rush to see the lion and ignore all the other animals.

That wouldn’t make any sense, would it? And as Gladwell and other keen thinkers are telling us, neither do the college rankings in U.S. News.

