September 18th, 2024

From Myth to Measurement: Rethinking US News and World Report College Rankings

The U.S. News college rankings face criticism for a flawed methodology that encourages manipulative behavior among institutions. The article proposes an alternative system focused on outcomes like student success and affordability.

The U.S. News & World Report college rankings, once seen as a helpful tool for evaluating higher education institutions, have come under scrutiny for their flawed methodology and the unintended consequences they create. Critics argue that the rankings rest on subjective measures, such as peer assessments, which distort institutional priorities and incentivize manipulative behavior. Some universities, for instance, have resorted to creative marketing tactics, like sending promotional items to peers, to influence their standing. The result is a culture in which institutions prioritize ranking improvements over genuine educational quality, and in which the rankings measure inputs, such as wealth and exclusivity, rather than meaningful outcomes like student success and satisfaction.

To address these issues, a proposed alternative ranking system would focus on post-graduation success, affordability, alumni satisfaction, and social mobility, aligning incentives with actual educational value. The current rankings exemplify Goodhart's Law: when a measure becomes a target, it ceases to be a good measure. A critical approach to these rankings is necessary, as they can mislead students and parents, divert resources from educational improvements, and ultimately harm the educational landscape.

- U.S. News college rankings are criticized for relying on subjective measures and creating a distorted view of educational quality.

- Institutions may manipulate data and engage in marketing tactics to improve their rankings.

- A proposed alternative ranking system would focus on outcomes like employment rates and student satisfaction.

- The current rankings can mislead students and parents, leading to poor decision-making.

- A more meaningful evaluation of colleges should prioritize genuine educational value and student success.
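
To make the proposal concrete, here is a minimal sketch of how an outcomes-weighted composite score could be computed. The metric names, weights, and sample figures below are illustrative assumptions, not values taken from the article or from U.S. News.

```python
# Minimal sketch of an outcomes-based composite score.
# All metrics, weights, and sample numbers are hypothetical.

METRIC_WEIGHTS = {
    "employment_rate": 0.30,      # share employed in field within a year of graduation
    "median_earnings": 0.25,      # adjusted for regional cost of living
    "affordability": 0.20,        # e.g. inverse of net price or median debt load
    "alumni_satisfaction": 0.15,  # survey-based, on a 0-1 scale
    "social_mobility": 0.10,      # e.g. graduation rate among Pell Grant recipients
}

def min_max_normalize(values):
    """Scale raw metric values to the 0-1 range so the weights are comparable."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.5] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_scores(schools):
    """Return a {school: weighted outcome score} mapping from raw metrics."""
    names = list(schools)
    scores = {name: 0.0 for name in names}
    for metric, weight in METRIC_WEIGHTS.items():
        normalized = min_max_normalize([schools[name][metric] for name in names])
        for name, value in zip(names, normalized):
            scores[name] += weight * value
    return scores

if __name__ == "__main__":
    sample = {  # hypothetical institutions and numbers
        "State Flagship U": {"employment_rate": 0.82, "median_earnings": 61000,
                             "affordability": 0.74, "alumni_satisfaction": 0.71,
                             "social_mobility": 0.58},
        "Elite Private U": {"employment_rate": 0.88, "median_earnings": 78000,
                            "affordability": 0.41, "alumni_satisfaction": 0.69,
                            "social_mobility": 0.22},
    }
    for school, score in sorted(composite_scores(sample).items(),
                                key=lambda kv: kv[1], reverse=True):
        print(f"{school}: {score:.3f}")
```

The point of the sketch is only that the inputs are outcomes rather than wealth or selectivity; any real version would still have to defend its choice of weights against exactly the Goodhart-style gaming the article describes.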

AI: What people are saying
The comments reflect a range of perspectives on college rankings and their implications for higher education.
  • Many commenters agree that college rankings are flawed and often manipulated, leading to negative consequences for institutions.
  • There is a consensus that reputation plays a significant role in perceived educational quality, creating a self-reinforcing cycle.
  • Some commenters highlight the impact of rankings on student debt and the importance of focusing on outcomes like affordability and success.
  • Critics argue that the current ranking systems contribute to social stratification and perpetuate inequalities in access to education.
  • Several comments mention alternative systems, such as the U.S. government's College Scorecard, as potential improvements to the ranking methodology.
12 comments
By @ineedasername - 7 months
Here's an insider's insight: Colleges hate these rankings too, including the shit-eating grin they have to plaster on their faces when they go out in public and say things like "We're one of the top 10 small public non-land grant research institutions in the upper middle pacific northwest region!"

But too many families have been convinced that rankings like this are useful, so now the institutions have little choice but to become complicit in the ridiculous system or have their enrollment decimated and be unable to pay their bills.

By @lordnacho - 7 months
College rankings are a bit like corruption rankings: they are measurements of perceptions. Not only that, perceptions are heavily influenced by previous rankings.

If your ranking doesn't show Harvard and MIT near the top, it is wrong. If your corruption index doesn't show Scandinavia and Western Europe above Africa and South America, it is wrong.

If it's wrong, people will not want to read it. If you do an "objective" ranking and it doesn't quite say what you expected, you need to weight things differently, so that your ranking has credibility.

At best you can do a bit of massage to show that you are actually doing something when compiling these rankings, and you might be able to highlight a few trends. But in the end, if you are not showing the usual suspects in the usual places, people will not believe you.

Reputation, at the end of the day, moves slowly.

By @r00fus - 7 months
Modern US society seems to be layers upon layers of gamified results with little coherent vision. We blame other countries for having "stifling" regulation, "burdensome" oversight and "top-down" planning, but honestly - it seems we simply let problems precipitate for decades on end and in many cases think it's some sort of achievement.
By @bbor - 7 months
Ha, pretty funny article. Well written for sure, but it's got major "if only they let ME run things, everything would be fixed in a day!" vibes. College rankings aren't broken because some data scientist made a bad decision; they're broken because they're an essential part of the contemporary American class system: sending your kid to SAT bootcamp and then a "prestigious" university is one of the few ways the top 10% can separate their children from their poorer peers. Private high school doesn't really go that far these days, other than for social conditioning and getting them into a good university where the real networking is unlocked.

All of the above applies tenfold for foreign students coming to prestigious US universities, as they pay exorbitant sums to get the name recognition, creating all sorts of weird incentives.

As someone who went to a somewhat prestigious university (Vanderbilt), fingers crossed we nationalize the whole system sometime soon... I think we can all agree that football, campus amenities, and marketing aren't where these billions should be going. Vanderbilt, to their credit, gamed the rankings a ~decade ago by offering to meet 100% of students' government-determined financial need with grants, which is probably the best possible outcome of this weird system.

By @mumblemumble - 7 months
My favorite thing I've ever heard said by the principal of our kids' elementary school: "Our test scores are down, which is great, maybe that will keep some of the school shoppers away this year."

Our city has a school choice program that includes a portal where you can look up these kinds of quantitative measures, and I think I agree with him. Tiger parents slosh from school to school as they chase after rankings, and, much like ill-contained liquid cargo in ships, all that motion tends to destabilize and capsize schools.

Sadly, I don't think smaller higher education institutions can afford to take such a relaxed attitude about it. They don't get to have an enrollment backstop in the form of a semi-captive audience of parents who live nearby and aren't hyperactive enough to commit to spending upwards of an hour every weekday trucking their kids back and forth across town.

By @jackcosgrove - 7 months
The OG listicle.

I do think it's a bit difficult to separate reputation from educational quality. Educational quality is affected by professors, facilities, research opportunities, etc, but these factors are saturated at a lot of research universities. Another factor of educational quality is the peer group, and in this sense reputation matters a lot. There's an "if you build it, they will come" circularity here, where if you have a good reputation, you will attract students who will burnish your reputation. Conversely institutions can get stuck in a loop of low reputation and marginal students.

If, however, you are armed with the knowledge that the non-circular components of education quality are saturated at, say, the top 50 universities in the US, well, that opens some doors. You can rest easy that your flagship land grant university honors program is giving you the same education you'd receive at an Ivy League school.

Personally I think student debt load and chosen major matter way more than which school you go to.

By @DiscourseFan - 7 months
I agree mostly with the article, but this stuck out to me

>Measure the proportion of graduates who pursue and are accepted into advanced degree programs. (adjusted for field of study so acceptance into medical school > masters in art history)

The only reason, however, that medical school is more competitive than an art history masters (to be fair, most paid masters programs aren't super competitive) is because the medical board has set up the system of accreditation to limit the number of practising doctors in the US in order to artificially inflate the salaries of doctors. The UK, for instance, has the opposite problem, where becoming a doctor is much easier, but for that very reason their pay is relatively low and the job is increasingly undesirable (not to mention cuts to the NHS).

This is all to say that, in just this instance, the author falls prey to precisely the same mystification that he is criticizing, by seeing something as "better" just because it's more exclusive.

By @gxonatano - 7 months
> Instead of measuring an institution’s wealth or selectivity, it could track metrics like:

> Employment Rates: Track the percentage of graduates employed in their field of study within six months to a year after graduation.

> Earnings Data: Use data on median salaries adjusted for regional cost of living and industry to assess economic outcomes.

This is a toxic way of looking at education, in my opinion, like it's some kind of vocational training. By this metric, two-month training programs in high-demand fields would be the highest ranked, by far. But often, the subjects which have the deepest impact on one's thinking—and on one's curiosity, intellectual abilities, and life in general—are the ones which have no easy one-to-one mapping to a job. I've taken philosophy courses, and anthropology courses, which have changed my life completely, but I didn't get employment as a philosopher or anthropologist within six months after graduation. The value of education is in how much it can change your thinking, satisfy your curiosity, and make you into a well-rounded person. It's not just whether it can prepare you for some job. And if you reduce education to "median salaries adjusted for regional cost of living" then you're measuring education purely on a sell-out basis.

By @paulpauper - 7 months
Even if no one published ratings, people have a general intuitive idea of what the top 20 colleges are, which are fixed even if the order changes. MIT will always be top 10, for example. Harvard will always be top, too. The acceptance rate is a good proxy for the ranking. Or success in the job market when applying.
By @ocean_moist - 7 months
Here is a relevant tool that ranks colleges (and degrees) by ROI/EV: https://www.collegenpv.com/programrankings/?pcip=11&page=1&s.... Not perfect, but better.

As someone who just played the college admissions game last year, these (US News) rankings pretty accurately reflect the *perceptions* of the general, college-applying public (or maybe they are the source of those perceptions). They aren't really even good at that outside the T10.

By @RecycledEle - 7 months
I wish someone would measure the bang-for-the-buck without considering politically correct metrics.
By @rswerve - 7 months
Too few people know about the US government’s own attempt to offer a better not-exactly-a-ranking system. It’s pretty good. https://collegescorecard.ed.gov/