
The end of college rankings as we know them

February 4, 2016 at 6:00 a.m. EST
A panoramic photo of George Mason University’s campus, a school that has made student well-being a focus of its new strategy. (Photo by Creative Services/George Mason University)

This story has been updated to include a response, below, from U.S. News & World Report.

It’s been nearly 25 years since U.S. News & World Report introduced the annual version of its college rankings. That’s also when the rankings shifted from what had largely been a beauty contest based solely on a survey of college presidents to one that aimed to replicate the quantitative nature of Consumer Reports.

If Consumer Reports could tell you with some specificity the best washer to buy or the most reliable car on the market, the thinking was that U.S. News could do the same with one of the most expensive purchases in life: a college degree.

But unlike Consumer Reports, which ranked products on how well they performed in daily use, U.S. News decided to rank colleges on the types of students they accepted (SAT scores and class rank), how much they spent on faculty (salaries and class size), and how many students stayed in school and graduated. It was as if Consumer Reports judged products based on the quality of their raw ingredients rather than the final product.

The rankings turned into a big business for U.S. News, even outlasting the print magazine that gave birth to them. They also spawned dozens of copycat rankings from other publications and organizations during the past two decades.

[Three predictions about the future of higher education]

While the U.S. News rankings still loom large among colleges that try anything to improve their position — just see the recent controversy at Mount St. Mary’s University — there are signs that the list is beginning to show its age in an era of changing consumer behavior about picking colleges.

For one, according to an annual survey of college freshmen across the country by UCLA researchers, just 18 percent of students said magazine rankings were important in influencing their final college selection. Rankings didn’t even break the top 10 among the factors students said were important.

The second reason the U.S. News rankings are in trouble is that several new tools and sets of rankings have emerged in recent years that are simply better, including Money magazine, the Economist, the federal government’s College Scorecard, and LinkedIn. They all attempt to do what U.S. News has largely failed to do: measure what actually happens to students after graduation — their jobs and salaries and their level of debt. In other words, they are trying to be Consumer Reports for higher education.

The latest addition to this group is Gallup, which on Thursday announced plans to certify colleges on the “well-being” of their graduates. In previous research as part of the annual Gallup-Purdue Index, the polling firm surveyed some 30,000 bachelor’s degree recipients and 1,500 associate’s degree holders nationwide to measure their well-being (that is, being happy, comfortable, and satisfied) in five dimensions: social, financial, sense of purpose, connectedness to their community, and physical health. Just 11 percent of college graduates are thriving in all five dimensions. More than one in six aren’t thriving in any.

Brandon Busteed, executive director of Gallup’s Education and Workforce Development, described the new certification as being similar to the process buildings go through to get LEED-certified for their environmentally friendly designs. Gallup will measure the efforts of universities to improve the well-being of their students and faculty, a process that Busteed said might take up to three years and not necessarily lead to certification.

The first school to sign up with Gallup is George Mason University, which made well-being a central focus of its recent strategic plan.

“Most of the outcomes people associate with a university now are about employment,” George Mason President Ángel Cabrera told me. “Our goals should be more than that. We claim to engage our students, train our citizens. We need to measure whether we are actually producing such graduates.”

[Someday the school name on your college diploma won’t be the most important thing]

Until recently, colleges were able to get away with telling students and parents to just trust them on the quality of the product. But faith in that assurance is flagging. One major study a few years ago found that one-third of students in a sample of 2,300 undergraduates at 24 colleges and universities made no gains in their writing, complex reasoning, or critical-thinking skills during four years of college. 

Part of the problem is that too many students are sleepwalking through college. They don’t engage enough in what researchers call “high-impact practices” — internships, undergraduate research, study abroad, writing-intensive classes, and interactions with professors. Many of these activities come outside the classroom, and as a result, are often not graded or measured as part of the formal degree program for which students are paying tuition.

The Gallup-Purdue Index already is measuring the impact of those outside-the-classroom activities, such as research projects, on the ultimate career success and well-being of graduates. So, too, are the other rankings that are finally looking at what happens to students after they graduate.

Only by measuring what truly matters in a college — the actual outcome of a degree — rather than how many valedictorians a college recruits for its freshman class, will parents and students be able to gauge the return on their investment in a specific college. And when that happens, U.S. News rankings based mostly on prestige won’t matter as much to the high-school students who just want to know whether they will get a job, contribute to society, and be happy after graduating from college.

Robert Morse, chief data strategist at U.S. News & World Report, wrote this response:

In “The End of College Rankings as We Know Them” Jeffrey Selingo ignores the fact that outcome measurements – such as graduation and retention rates and alumni giving – account for more than 30 percent of the U.S. News rankings and are the most heavily weighted factors in the Best Colleges methodology. The change in the methodology was made in 2014 to reflect the current state of college admissions and emphasize student outcomes. The U.S. News Best Colleges rankings have been around for more than 30 years and we have participated in the important and ongoing debate about how to best measure a college. Our adjustment to increase the focus on outcomes is one example of how we continually evolve our methodology. When more kinds of reliable outcomes data become readily and permanently available, allowing us to make sound comparisons between colleges and universities, we will seriously consider incorporating it into our ranking methodology. We make methodology and data adjustments when they improve our rankings to better serve the millions of students and families that come to www.usnews.com each month to research colleges.