Do College Rankings Exaggerate the Academic Arms Race?

Hispanic Outlook in Higher Education Magazine
September 7, 2009


By Peggy Sands Orchowski


Every spring, college students, their parents and university officials hold their breath as they open the major news publications listing the latest rankings of the nation's "top" postsecondary institutions. Will their alma mater, prospective choice or faculty make the list? In what position?


The answer can affect whether a student applies to a college, and whether a college draws more applications from top students, hires top faculty and (most importantly) secures significant financial support from donors and from private and public foundations. The answer might even affect the tuition the institution charges: following market theory, the higher the ranking, the more the institution may feel justified in charging as more students apply and more of its faculty members are famous.


But do magazine rankings of colleges truly evaluate their performance in educating and preparing students for the competitive global economy, especially the value delivered in relation to the cost to the student? This is the question that the Center for College Affordability and Productivity (CCAP) and a panel of experts examined at the American Enterprise Institute (AEI, a right-of-center think tank) in Washington, D.C., in mid-May.


The answer from the three panels, which analyzed the role of rankings, their harm or helpfulness, and new approaches to assessing and ranking college performance, was nuanced. The panelists found that well-publicized magazine college rankings could be exaggerating the academic "arms race" (the race for higher tuition, top students and world-renowned faculty). Consumer demand for college rankings, and their popularity, may well have drifted from the original mission: helping prospective students find and finance a quality education that fits their individual educational needs.


The spring madness over college rankings could be said to have been started by the normally sober news magazine U.S. News & World Report. Its first issue ranking colleges came in 1983; the rankings issue now sells millions of copies each year. It is said to "influence the matriculation decisions of high school students throughout the world. Some university officials derisively claim that higher education is the victim of 'management' by that magazine," write Luke Myers and Jonathan Robe of the CCAP.


“U.S. News’ philosophy in starting the rankings was not only to serve prospective students but also journalists writing about current trends in higher education, by giving them a transparent way to compare quality universities,” said Robert Morse, director of data research for U.S. News & World Report. “Our process is totally independent of any college. But we always welcome more and better ways of ranking schools. To us, more is better.”


Various organizations are introducing new approaches to evaluating university performance, several of them based on learning outcomes. VALUE, a program of the Association of American Colleges and Universities, distinguishes between assessing which areas of knowledge a college covers (such as the humanities, the sciences and global world cultures, the top fields) and assessing which skills and abilities are taught and measured. The latter include writing, critical thinking, quantitative reasoning, oral communication, intercultural competence, information literacy, ethical reasoning, civic engagement, application of learning, research and integration of learning.


Although student graduation rates would seem a natural measure of college outcomes, they prove among the most difficult to capture, according to panelists. A shrinking share of the first-year students who enter a college graduate from that same institution; fewer still graduate from any college in four years. Growing numbers of four-year-degree graduates began their studies at community colleges. And there is little tracking of the ever-fluctuating enrollment of community college students, who may eventually complete a four-year degree (after earning an A.A. degree or transferring in the sophomore year, for instance), go on to graduate work, or stop their higher education with the A.A. degree or a certificate.


“It’s easier to target departments than the overall campus completion rates,” sighed Steven Goodman, former executive director of the College Admissions Institute of America. That’s why so many college ranking systems focus on college application rates rather than college completion data.


“It’s time to put rankings in the sports pages and international convergence (a form of technology transfer that brings nations at different levels of development to the point of embracing the same paradigm) on the front page,” said Clifford Adelman, a senior associate at the Institute for Higher Education Policy (IHEP). “As the primary function of all institutions of higher education is the distribution of knowledge, the most important issue on our plates is the quality, meaning and transparency of the degrees awarded. Let’s dispose of rankings quickly and move on to more significant territory,” he concluded.


As usual at most AEI conferences on education, its saucy visiting scholar Richard Vedder, a professor at Ohio University and author of Going Broke By Degree: Why College Costs Too Much (AEI Press, 2004), was saved for last to offer his unique perspective. "Confidence in higher education is falling at the same time that costs are increasing," the CCAP's director summarized. "University arrogance and elitism have probably contributed to the fact that for the first time in our history, a college education may not be part of the American dream, leading to greater inequality in America. Consumer-driven popularity rankings add to this by failing to track graduation outcomes, by not ranking highly the variety of colleges that are not research-oriented, and by allowing colleges to resist including data showing that many ranked features, added at high cost, actually add little value to the education received."


"Rankings of universities should include alternative college rankings for nonresearch institutions," said Vedder, a former member of the Secretary of Education's Commission on the Future of Higher Education under President Bush. "They should be more student-based. And they should eventually include rankings of former students' earnings 10 years after graduation from the institution."
