It has become an annual ritual: U.S. News & World Report releases its controversial “Best Colleges” rankings. Universities that do well in the rankings celebrate; those that don’t are left to grumble among themselves. Given the role the rankings play in the decision-making of college-bound students, most universities have opted to simply shrug, play along, and use the rankings to their advantage when they can.
But this year, those of us who lead universities should seriously examine our role in a process that misleads students and families.
All college rankings are problematic. But for several years, U.S. News has adopted increasingly questionable practices for gathering and interpreting its data, which have prompted some universities and professional schools to stop participating in the process altogether. With the version released Sept. 18, U.S. News’s rankings process has gone from bad to worse.
What U.S. News calls “the most significant methodological change in the rankings’ history” has rocketed some institutions up the list and sent others sliding down. Many colleges had seismic shifts in their purported positions on the list. Vanderbilt, the university I oversee, moved from a tie for 13th to a tie for 18th, a fairly modest change. An analysis by our institutional research team found that 107 institutions—25 percent of those included in the “National Universities” ranking—moved 30 places or more.
Does this mean those of us who’ve fallen in the rankings are objectively worse than we were a year ago? Does it mean a university that shot up the list is suddenly orders of magnitude better? Of course not. The shifts in rankings are largely due to the changes in methodology. And U.S. News’s ever-changing methodology—what to count and how much weight to give it—is a subjective exercise that bears little relationship to the mission of our institutions.
The goal of research universities like ours is to provide a transformative education and to produce path-breaking research. To do that, we work hard to attract the most promising students and faculty and to create a collaborative university community where everyone can realize their full potential.
Providing access to students of all backgrounds is a core value for our university and for many others. For our undergraduate students, that means that we, like a number of private universities, admit U.S. citizen (and eligible noncitizen) students regardless of their ability to pay and meet 100 percent of each admitted student’s demonstrated need without loans. Our generous financial aid means every student can graduate from Vanderbilt debt-free.
Last year, Vanderbilt spent $366 million on financial aid, with $244 million going to undergraduates. Most of those undergraduate funds were provided through our Opportunity Vanderbilt student access program, which enabled us to fully cover undergraduate tuition for nearly every family with an income of $150,000 or less. This means that families could send a child to Vanderbilt for less than what it would cost in-state residents to attend many of the nation’s top-ranked public flagship universities.
Our efforts to increase access are paying off. The number of Pell Grant–eligible students among our first-year students has increased by 70 percent over the past decade, and our number of first-generation students has climbed by more than 175 percent.
The U.S. News rankings ignore all this because the data they use don’t measure it. While U.S. News has stressed that the new methodology gives more weight to social mobility, it draws its data from the College Scorecard, which covers only students who received federal aid.
To see why this is misleading, consider two colleges with 1,000 students each. At College A, a single student has a federal loan of $12,000. Because only students with federal loans enter the calculation, U.S. News would report College A as having average student indebtedness of $12,000, a metric that accounts for 5 percent of the overall ranking (up from 3 percent last year). Yet across all 1,000 students, the average balance is actually just $12.
In contrast, suppose that at College B, each of the 1,000 students takes out a federal loan of $10,000. Here, the average indebtedness is $10,000, which is also the value that would be reported by U.S. News. This gives the false impression that College B is more affordable.
Suppose College A wanted to “improve” its ranking. What should it do? Simple: just encourage each of its 999 debt-free students to take out $100 in loans. Now every student is included in the U.S. News calculation, and the reported average falls to ($12,000 + $99,900)/1,000 = $111.90. So, by increasing its students’ total debt, College A cuts its reported indebtedness from $12,000 to $111.90 and vaults up the rankings. Of course, encouraging students to take on debt in order to improve an indebtedness ranking would be absurd and unethical. But that such a bizarre implication exists at all demonstrates the complete lack of validity of U.S. News’s measure of student indebtedness.
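To make the arithmetic concrete, here is a minimal sketch in Python of the College A and College B example above. The borrower-only “reported average” is a simplified stand-in for the U.S. News indebtedness metric, and the colleges and dollar figures are the hypothetical ones from the example, not real data.

```python
# Minimal sketch of the indebtedness example above. The borrower-only
# "reported" average is a simplified illustration of the article's point,
# not U.S. News's exact formula.

def reported_average(debts):
    """Average debt over students with nonzero debt (borrowers only)."""
    borrowers = [d for d in debts if d > 0]
    return sum(borrowers) / len(borrowers) if borrowers else 0.0

def true_average(debts):
    """Average debt over the entire student body."""
    return sum(debts) / len(debts)

college_a = [12_000] + [0] * 999          # one $12,000 loan, 999 debt-free
college_b = [10_000] * 1_000              # everyone borrows $10,000
college_a_gamed = [12_000] + [100] * 999  # the 999 each add a token $100 loan

for name, debts in [("A", college_a),
                    ("B", college_b),
                    ("A, gamed", college_a_gamed)]:
    print(f"College {name}: reported ${reported_average(debts):,.2f}, "
          f"true ${true_average(debts):,.2f}")

# College A: reported $12,000.00, true $12.00
# College B: reported $10,000.00, true $10,000.00
# College A, gamed: reported $111.90, true $111.90
```

The gamed scenario reproduces the $111.90 figure above: College A has taken on more total debt, yet its reported number looks dramatically better.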
In Vanderbilt’s case, 83 percent of students graduate with no debt whatsoever. Families that choose to borrow anyway usually do so to manage short-term cash flow.
These misleading measures matter. For many years, researchers have pointed to the problem of “undermatching,” in which high-achieving students from low-income and first-generation households do not apply to selective colleges even though those colleges could cost them less, thereby significantly limiting their educational opportunities.
The same methodological flaw applies to the rankings’ measure of career outcomes. That measure, a new addition to the rankings this year, tracks the percentage of graduates who earn more than a typical high school graduate four years after graduation, using College Scorecard data.
But since the College Scorecard tracks only the limited cohort of students who received federal aid, just about a third of our graduates were considered in U.S. News’s rankings. To put it differently, the U.S. News ranking puts no value on the career outcomes of more than two-thirds of our student body. This misrepresentation is particularly ironic, considering that the main reason so few of our students rely on federally subsidized aid is the significant financial support we provide.
In addition, this year’s rankings eliminated key measures of academic quality, such as the percentage of faculty who have attained the highest degrees in their fields, the percentage of entering students who are in the top 10 percent of their high school class, and average class size. Other factors that affect the quality of the student experience, including resources available to faculty, were assigned less weight. If a ranking scheme ostensibly concerned with academic quality is reducing its academic quality criteria, what, exactly, is it measuring?
These changes come after a year of turmoil at U.S. News & World Report. Several law schools, led by Yale, withdrew from the rankings last year. There was similar upheaval in the rankings of schools of medicine; some, such as Harvard Medical School, have stopped participating. U.S. News is fully aware of universities’ objections. Our university and others pointed out the flaws in the rankings, to no avail.
Some argue that universities should ignore the annual rankings circus. But many students and families rely on rankings to navigate college choice, and this is where the latest U.S. News rankings really fail. By reducing its measures of academic quality and using biased data for affordability and career outcomes, U.S. News has delivered a misleading guide that is now even less reliable in helping families—particularly first-generation households and those with lower incomes—make one of the most important decisions of their lives.
The annual ritual of the U.S. News rankings release might not end anytime soon. But universities can no longer in good conscience stay silent.