By most accounts, Singapore has a stellar education system, including being home to many world-class institutes of higher learning (IHLs).
Most recently, local news reports proudly celebrated the National University of Singapore (NUS) and Nanyang Technological University (NTU) tying for 11th place among universities globally.
But if we were to accept the premise that the rankings mean something, then what should we make of the fact that Singapore Management University (SMU) was ranked at 477th place?
If we believe the rankings give an accurate and impartial assessment of universities around the world, the conclusion then is that we have two great local universities and one that is lagging horribly behind.
But as I think most Singaporeans would agree, SMU surely can’t be that far behind in terms of quality of education provided, resources available to faculty and students, and opportunities for career advancement and further education.
Perhaps the problem lies in the way the rankings are derived? With the pride of thousands of SMU students at stake, DollarsAndSense investigates.
Who Makes University Rankings, Anyway?
The global ranking referenced in this recent news cycle was from Quacquarelli Symonds (QS), a private company that provides student recruitment services to universities and runs networking conferences for academia, among other things.
It is worth noting that while widely respected and often-cited, the QS World University Rankings is just one of dozens of global university rankings, alongside the Times Higher Education World University Rankings, the Academic Ranking of World Universities, and many more.
In addition, there are also regional and national university rankings, maintained by regional research centres, respected publications, and governmental/non-profit organisations.
The point is: there is no single, authoritative ranking of universities. Because of this, universities can simply pick and cite rankings that paint them in a favourable light when marketing themselves to prospective students.
How Are The QS World University Rankings Derived?
The QS World University Rankings scores universities on four groups of indicators: Academic, Employer, Student, and International.
Let’s take a look at what each group of indicators comprises and how it is scored.
Academic Reputation: This is scored based on surveys of academics around the world. QS has highlighted that this is the “centerpiece” of its rankings, and it carries a weighting of 40% of the final score.
H Index: Scored based on the productivity and impact of each institution’s published research.
Citations per Faculty: Scored based on the number of citations earned by the university, adjusted for faculty size.
Staff with PhDs: This indicator is based on the proportion of PhDs within each institution.
Employer Reputation: This score is based on a survey of employers’ perception of each institution. QS has stated that employer reputation accounts for 10% of the final rankings weightage.
Employers’ Presence On Campus: Based on the number of employers who have an active presence on a university’s campus.
Graduate Employment Rate: The percentage of a university’s graduates who are employed up to 12 months after graduation.
Alumni Outcomes: Aims to evaluate how successful alumni of institutions have been.
Faculty/Student Ratio: Looks at the proportion of staff relative to the number of students.
Student Exchange Inbound: Scored based on the number of inbound exchange students.
International Faculty Index: Scored based on the proportion of faculty members that are non-local.
International Student Index: Scored based on the proportion of students that are non-local.
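To see why weightings matter, here is a minimal sketch of how a weighted composite score of this kind works. Only the 40% (Academic Reputation) and 10% (Employer Reputation) figures come from QS as cited above; every other weight and every indicator score below is a made-up illustration, not QS's actual formula.

```python
# Illustrative weighted composite score, in the spirit of (not identical to)
# the QS methodology. Only the 0.40 and 0.10 weights are from the article;
# the remaining weights and all scores are assumptions for illustration.
WEIGHTS = {
    "academic_reputation": 0.40,   # stated by QS
    "employer_reputation": 0.10,   # stated by QS
    "citations_per_faculty": 0.20,   # assumed
    "faculty_student_ratio": 0.10,   # assumed
    "international_students": 0.10,  # assumed
    "international_faculty": 0.10,   # assumed
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * score for name, score in indicators.items())

# A hypothetical university's indicator scores.
base = {
    "academic_reputation": 70,
    "employer_reputation": 65,
    "citations_per_faculty": 60,
    "faculty_student_ratio": 55,
    "international_students": 40,
    "international_faculty": 50,
}

# Raising a single indicator (e.g. admitting more foreign students)
# directly lifts the overall score, without any change in teaching quality.
boosted = dict(base, international_students=90)

print(round(composite_score(base), 1))     # 61.0
print(round(composite_score(boosted), 1))  # 66.0
```

The point of the sketch: because the composite is just a weighted sum, any indicator a university can directly influence, such as the proportion of international students, becomes a lever for climbing the table.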
Glaring Flaws In The QS World University Rankings
While perhaps no methodology for ranking universities will ever be perfect, it doesn’t take someone with a PhD to see some of the glaring biases and flaws in the current way QS does its global university rankings.
For one, it favours well-established universities over newer ones. This creates a self-fulfilling prophecy where well-known universities are perceived to be better, and the rankings confirm this perception.
Some older universities might offer a poorer educational experience than newer ones yet still score higher, since much of the score is based on the perceptions of employers and other academics.
Another obvious issue is the implicit valuing of foreign students within a university’s student population. Neither how well these foreign students integrate with the rest of the student body nor the quality of these students is taken into account.
Does it make sense that universities can simply increase their quota for foreign students to score better on the Student and International indicators, and consequently climb the rankings?
Take Rankings With A Pinch Of Salt
By going just a little deeper beyond the headlines, we can have a better perspective on what to make of these rankings.
In the case of the QS World University Rankings, it does seem like the indicators can be easily “gamed” by clever universities with the resources and desire to climb up the rankings.
Of course, if in the process it leads to a better academic experience and campus life for the students, then it’s a positive development for everyone concerned. However, prospective students should look beyond these rankings, and understand for themselves what studying at a particular university is like and what their future prospects might look like after graduation.