Why Surveys and Rankings Matter
It’s hard to imagine anymore that “searching for the right college” could mean anything other than searching online. And as students sift through the many schools they could apply to, they inevitably glance at rankings. “The Best Business Schools List,” “America’s Best Business Schools,” “The Best MBAs of 2015”: rankings have proliferated to the point that if we consult any at all, we can no longer consult just one.
Colleges covet these rankings, despite their overabundance, and actively compete for recognition. A good ranking can mean any number of things: that students are more likely to enjoy their experience, land a good job after graduation, or earn more money over the long term. Some rankings are so well respected that a single number can confer prestige on faculty, staff, students, and facilities alike.
Ranking organizations like Bloomberg Business, U.S. News and World Report, and Princeton Review each have their own methodology, but one thing most major rankings have in common is that they survey faculty, students, staff, alumni, and employers before translating their data into a list.
A major assumption underlying all of this is that schools with more engaged survey respondents will end up with a more representative ranking, and that schools that respond poorly, whether or not they are bad schools, will look worse in the rankings. Rankings can only work with the data available, after all, and while fair methodology matters, it is just as important that participating schools respond as fully as they can.
To an extent, a ranking reflects the school’s engagement with the ranking process, and that can shape how much value employers see in a given school’s degree, or whether that school’s graduates land better jobs at higher pay. So before you hastily dash off a survey response, or worse, don’t respond at all, stop to consider what you might be telling the world about your school.