Colleges Falsify Data to Boost their Ratings in U.S. News
Over the last few days, a number of newspapers have reported that an administrator at California’s prestigious Claremont McKenna College has been fired for falsifying data that the school submitted to U.S. News for its college rankings. Since 2005, the administrator in question had been adding points to the SAT scores of incoming students in the data he reported, as a way to boost his school’s position in the U.S. News college rankings. (The average SAT scores of incoming students are one of the key indicators the magazine uses to rate schools’ competitiveness.) And the strategy apparently worked: the college – which is actually excellent – has worked its way up and is now listed as the ninth-best liberal arts college in America in the U.S. News rankings. (We have no way of determining whether Claremont McKenna would have reached that same rank if its official had not been cooking the data. But the point is, cheating was taking place.)
Well, call us not surprised. What is remarkable is that a story like this would surprise anyone. Because you see, the methodology that U.S. News uses to collect data is deeply flawed, and has been since the magazine started compiling its college rankings. The problem is that the data U.S. News uses to rank colleges is submitted by the colleges themselves.
Granted, U.S. News has a staff charged with reviewing submitted data to be sure it seems accurate. But can any staff of editors really review and verify such vast amounts of data from hundreds of American colleges and universities? As we have just learned, those checkers missed the fact that Claremont McKenna had been submitting inflated SAT scores since 2005.
U.S. News’s system of data collection makes it terribly easy for colleges to slant and distort the data they submit. It seems to happen all the time. A few years ago, for example, the University of Southern California submitted false information about the number of faculty who were members of the National Academy of Engineering, in an effort to boost its rank among engineering graduate schools. And in 2009, an internal audit at the University of Florida found that its business school had been submitting false data about alumni job placement success.
But let’s face it. The motivation to gain a higher U.S. News ranking is so great that distorting data has become pernicious and pervasive in American colleges. Here are more dishonest practices that seem to have taken hold . . .
- Playing with admission rates. Colleges attract as many applicants as they can by mailing to thousands of high school students – then they reject most of them. It’s a way of making colleges appear more selective in the U.S. News statistics. And there is nothing the magazine can do to control it.
- Implementing strategies to report higher SAT scores. One strategy is to make standardized tests optional. A Bloomberg article explains, “. . . students who don't submit their SAT scores test between 100 and 150 points lower than those who do report their results. Keeping those scores out of a college's average makes it look better.” And according to a recent article in The New York Times, Baylor University offered financial rewards in 2008 to admitted students to retake the SAT so the school could report higher scores to U.S. News.
- Overlooking the fact that misreporting data is widespread within a college. According to that same article in the New York Times, “Iona College in New Rochelle, north of New York City, acknowledged last fall that its employees had lied for years not only about test scores, but also about graduation rates, freshman retention, student-faculty ratio, acceptance rates and alumni giving.”
- Distorting the percentage of students who receive financial aid. One practice, which U.S. News has taken steps to prevent, is for colleges to claim that students who receive federal educational loans and other non-college funding are “receiving financial aid.” That makes schools more attractive to students who need scholarships.
- Under-reporting average class sizes. At one large university in the Northeast, for example, it is common knowledge on campus that the school under-reports the number of lecture classes that students are required to take.
So why doesn’t U.S. News simply stop publishing its college rankings, given that its basic data-collection methodology is clearly not working? The editors would probably say that the ratings offer students and their families a means of judging which colleges are best, and which provide the best value for the tuition dollars they are about to spend. But speaking of dollars, might they have something to do with why this magazine’s annual college rankings issue is not about to go away?