Op-Ed: The tyranny of college rankings – and why we must leave them behind
For all those disappointed applicants hoping to get into a school highly ranked by US News & World Report or a similar publication, take heart.
This is your chance to be freed from the tyranny of college rankings.
“Tyranny” is not too strong a word. The people who publish college rankings wrap their products in an alluring veneer of professional expertise and statistical rigor. They express their ratings in catchy numbers, presented in descending order (from 1 to 391, in the case of the US News list of “national universities”).
According to this list, UCLA, ranked 20th, is better than USC, ranked 27th. So it must be true.
Is that so? If you look at the methods used to produce these numbers, you’ll see that the whole enterprise, like the Emerald City in the land of Oz, consists mostly of blue smoke and mirrors.
Consider the formulas rankers use to calculate these numbers. Every step in the process – the selection of variables, the weights assigned to them, the methods for measuring them – is based on essentially arbitrary judgments.
US News, for example, selects 17 metrics for its formula from among hundreds of available choices. Why, for example, does it use students’ SAT scores but not their high school GPAs? Faculty salaries but not the quality of faculty teaching? Alumni giving but not alumni income? Why doesn’t it include things like a school’s spending on financial aid or its racial and ethnic diversity?
Likewise, the weights used to combine these variables into a total score are completely subjective.
US News has somehow concluded that a school’s six-year graduation rate is worth exactly 17.6% of its overall score, while its student-faculty ratio is worth only 1%. In judging a school’s “academic reputation,” it gives a whopping 20% weight to the opinions of administrators at other colleges, most of whom know very little about the hundreds of schools they are asked to rate, other than where those schools stood in the previous year’s rankings. And the publication gives no weight at all to the opinions of students or graduates.
Likewise, different rankers measure each variable in different ways. Consider graduation rates. Some rankings use the percentage of students who graduate within six years; others use an eight-year window. Some include transfer students; some don’t. Some rankers measure “student excellence” by matriculants’ average SAT scores, high school GPAs or high school class rank. Others use acceptance rates or yield rates.
Even if you think the ranking formulas make sense, the calculations behind them are based largely on unaudited and unverified data self-reported by the very schools being ranked. Would you invest in a company based on such information?
Throughout their history, college rankings have been plagued by allegations of data fabrication and manipulation. Over the past month alone, USC, Columbia University and Rutgers have all been accused of submitting erroneous or false data to US News, and a former dean at Temple University was sentenced to prison in March for a fraudulent scheme to boost the school’s prestige.
Most observers believe that these public revelations represent only the tip of a very large iceberg.
Behind the data manipulation lies the even bigger problem of schools altering their academic practices in a desperate attempt to gain ranking points. Examples include inflating a school’s fall-semester “class size index” by moving large introductory lectures to the spring semester. Or boosting its yield rate by expanding early-admissions and merit-aid programs that primarily benefit wealthy applicants at the expense of needier ones. Or improving graduation rates by relaxing academic standards.
Finally, the rankings impose a single formulaic model on hundreds of wonderfully diverse institutions. US News, for example, lumps Caltech, Santa Clara University, Chapman University and Fresno State, along with UCLA and USC, into its long list of national universities, as if they were all fungible examples of a uniform product differing only in relative status.
In short, popular “best college” rankings attempt to force American colleges and universities into a rigid hierarchy, based on arbitrary formulas, fueled by unreliable data.
Instead of relying on someone else’s subjective idea of what one should want in a college, applicants should ask themselves what would serve their own purposes. Do they see college as a way to immerse themselves in a particular field of study? Obtain an impressive pedigree? Qualify for an economically rewarding career? Prepare for community service? Seek guidance toward a meaningful and fulfilling life? Or something else?
Part of the tyranny of college rankings is their allure of simplicity. They promise to reduce the complexity of college choice to a simple number. But choosing a college is anything but simple. College is one of the most complex “commodities” money can buy. And these four years are an extremely important period of exploration and personal development.
Choosing a college should be approached as an exercise in self-discovery. Being rejected by a school that a complete stranger has rated highly may be just the thing to set applicants on that path.
Colin Diver is the former president of Reed College and former dean of the law school at the University of Pennsylvania. He is the author of “Breaking Ranks: How the Rankings Industry Rules Higher Education and What to Do About It.”