The Tufts Daily

Tufts by Numbers: Are college rankings arbitrary?

It's September of my senior year of high school, and I'm sitting on my bedroom floor flipping through The Princeton Review's "The Best 381 Colleges" with sweaty palms. My future seems so freaky! I'm not even sure I care about all the categories being ranked, but I'm already worried about making the wrong choice.

Now, as I see tour guides marching prospective students around campus, I wonder about the rankings on college comparison lists. Are they simply fuel for an elitist college industry? Or is there valuable knowledge within the huge books found in guidance counselor offices? To sort out my own thoughts on the matter, I checked out the methodology of The Princeton Review, one of the most widely used and trusted college admissions services, to see what these future-shaping, sitting-on-my-bedroom-floor-worrying, guidance-counselor-revered numbers even mean.

The Princeton Review's website states that the data in its rankings come from surveys of 143,000 students at 381 schools. Each student answers a survey of 80 questions, with responses ranging from "strongly agree" to "strongly disagree" or from "excellent" to "poor." Once a student fills out the survey, that student's college receives a score based on the individual's responses; schools are then ranked against one another in separate categories, with each school's total score built from all of these individual scores.
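To make that aggregation concrete, here is a minimal sketch of how one school's category score could be assembled from individual survey responses. The Princeton Review does not publish its exact formula, so the five-point answer mapping and the plain averaging below are my own assumptions, for illustration only.

    # Illustrative sketch only: The Princeton Review does not publish its exact
    # scoring formula. The five-point mapping and simple averaging are assumptions.
    LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
              "agree": 4, "strongly agree": 5}

    def student_score(responses):
        """Average one student's answers on a 1-to-5 scale."""
        points = [LIKERT[answer] for answer in responses]
        return sum(points) / len(points)

    def school_score(all_responses):
        """Average the individual scores of every surveyed student at a school."""
        return sum(student_score(r) for r in all_responses) / len(all_responses)

    # Within a category, schools would then be ranked by sorting these scores.

However the real formula works, the takeaway is the same: a single ranked number compresses tens of thousands of individual, subjective answers.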

Seem complicated? This is only the beginning. At the bottom of The Princeton Review's webpage is a note explaining, "The Princeton Review college rankings are different from The Princeton Review college ratings." I had to re-read the sentence a few times to gather that there are two separate systems: one that ranks schools against each other and one that rates each school in general categories. Some of these general categories are Admissions Selectivity, Financial Aid, Fire Safety and Green. For these ratings, the review board relies on data provided by each school's administration, either in surveys or in hard numbers from the academic year.

Within this niche of college ratings lies the mysterious quality of life rating. I remember sitting on my bedroom floor comparing and contrasting this rating, hoping that if I went to the school with the best quality of life I would be guaranteed to be happy there. (I almost feel embarrassed by this melodrama. Instead, I realize that I am simply the product of colleges' billions of marketing dollars. I feel no guilt.)

The first oddity I noticed about The Princeton Review's quality of life rating on its website is that it runs on a scale from 60 to 99. For a reader who naturally expects a 100-point scale, this skews how college hopefuls interpret the number. I'm also amused by the factors that the quality of life rating includes, such as students' assessment of "the ease of ... dealing with administrators" and "the interaction of different student types."
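To see how much that truncated scale can mislead, consider a quick bit of arithmetic (mine, not The Princeton Review's): a quality of life rating of 79 reads like 79 percent at a glance, but on a scale whose floor is 60 it actually sits less than halfway between the worst and best possible scores.

    def position_on_scale(rating, low=60, high=99):
        """Where a rating falls between the scale's floor and ceiling, as a percentage."""
        return 100 * (rating - low) / (high - low)

    print(position_on_scale(79))  # about 48.7 percent of the way up, not 79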

I can't help but question how ratable these factors, or even entire colleges, could ever be. Based on vague categories and easily misread rating scales, these ranking and rating systems may benefit the colleges more than the students.