Co-authored by former IIT Delhi director and BITS Pilani VC V Ramgopal Rao, the paper notes problems including scope for bias in the research, perception, diversity and other parameters.
Shradha Chettri | August 20, 2024 | 07:42 PM IST
NEW DELHI: An analysis of the National Institutional Ranking Framework’s rankings has revealed “inconsistencies” – including huge fluctuations in positions – that “raise…concerns about their reliability”.
The paper, “Unpacking Inconsistencies in the NIRF Ranking”, authored by V Ramgopal Rao, group vice-chancellor of the Birla Institute of Technology and Science (BITS), Pilani, and Abhishek Singh, also of BITS Pilani, was published in the June edition of the journal Current Science. Rao is a former director of the Indian Institute of Technology (IIT) Delhi, which placed second among engineering colleges in the NIRF ranking 2024. The ninth edition of the rankings was released on August 12.
The authors have not only looked closely at the outcome of the ranking exercise but also at the process itself – the parameters, sub-parameters, and weightages assigned to them. They found that for research, the NIRF relies mainly on bibliometrics – counts of citations and publications – while neglecting other forms of research output and intervention. The process allows wide fluctuations; some parameters leave scope for bias or suggest flawed survey methods; others rely entirely on single data points, such as the number of courses on the SWAYAM portal.
“The present study has identified several inconsistencies, thus raising concerns about their reliability,” says the paper. “These include huge fluctuations in the rankings, overemphasis on bibliometrics neglecting non-traditional research outputs, subjective nature of perception rankings that introduces biases, challenges in the regional diversity metric, overlooking teaching quality, inadequate transparency in methodology, questions about data integrity and limited global benchmarking.”
The NIRF was approved by the education ministry – then known as the ministry of human resource development – and launched on September 29, 2015. The broad parameters include the following:
Teaching, learning and resources
Research and professional practices
Graduation outcomes
Outreach and inclusivity
Perception
The “teaching, learning and resources” section includes student strength, faculty-student ratio, faculty with PhD, financial resources, online education, multiple entry and exit, Indian knowledge systems and regional languages.
The “research and professional practices” parameter includes publications, citations, patents, research projects, and publications and citations related to the sustainable development goals (SDGs). “Graduation outcomes” covers placement and higher studies, median salary and PhD students.
The “outreach and inclusivity” part looks at regional and gender diversity, and at economically, socially and physically-challenged students. Perception has two sub-parameters – perception among academic peers and among employers.
Rao and Singh’s paper highlights wide year-to-year fluctuations in ranks as a matter of concern. Positions are stable up to around the top 20, but below that they can fluctuate wildly.
It states, “While some fluctuations may be attributed to genuine changes in performance, others might result from factors beyond the control of an institution, such as temporary variations in data reporting or interpretation errors. Unlike some international ranking systems, such as the QS World University Rankings, which utilise a damping mechanism to smooth large inter-annual swings, the NIRF rankings lack a similar mechanism.”
The paper acknowledges other lacunae in the QS rankings but states that this damping approach minimises the impact of potential temporary fluctuations.
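The damping the paper describes can be illustrated as a simple blend of a current score with the previous year's score. The function name and the equal 50-50 weighting below are assumptions for illustration only; neither QS's exact formula nor any NIRF mechanism is being reproduced here.

```python
def damped_score(current: float, previous: float, alpha: float = 0.5) -> float:
    """Blend this year's score with last year's to soften inter-annual swings.

    The weighting `alpha` is a hypothetical choice for illustration; per the
    paper, the NIRF currently applies no such smoothing at all.
    """
    return alpha * current + (1 - alpha) * previous

# A sudden drop from 80 to 60 would be reported as 70, so the
# institution's rank moves gradually rather than lurching.
print(damped_score(60.0, 80.0))  # 70.0
```

With such smoothing, a one-off data-reporting glitch perturbs an institution's position only partially in the year it occurs, instead of producing the wild swings the authors document.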
“The fluctuations in rankings can have a significant impact on the perceptions of stakeholders, including students, parents and potential collaborators. Institutions experiencing large rank changes may find it challenging to manage the perceptions associated with such variations, potentially affecting their credibility and reputation,” added the paper.
Then, the paper notes the “heavy reliance on bibliometrics” and corresponding “neglect” of non-traditional forms of research contribution and knowledge building.
The analysis published by the ministry on the NIRF stated that only the top 100 institutions were contributing significantly in terms of publications, patents, faculty-student ratio and other parameters.
“Bibliometrics, while offering a numerical perspective on research impact, falls short in encompassing critical elements like relevance, innovation, social impact and contributions beyond traditional publishing. The overemphasis on bibliometrics raises concerns about the comprehensive evaluation process, especially within the research and professional practice metric. Exacerbating these problems is the bibliometric methodology employed by the NIRF, which relies entirely on commercial databases for data collection. This dependence reveals shortcomings in terms of scope, precision and the incorporation of non-traditional research outcomes,” the paper explains.
There is also a concern that an excessive focus on bibliometrics may discourage academics from addressing local issues and regional objectives.
The paper says that including the perception parameter has added a “subjective” aspect to the ranking and there is a need to bring transparency.
“Metrics based on surveys or subjective assessments carry risks of manipulation or biases, underscoring the need to maintain the integrity of the perception metric. It may be observed that certain legacy institutions exhibit lower perception scores compared to several relatively newer institutions. It raises the possibility that the pool of academicians and employers, whose perceptions contribute to these rankings, may not be as engaged or interested in participating in the survey for well established institutions,” added the paper.
On regional diversity parameters, the paper finds that the NIRF is skewed against institutions in states with larger populations.
“To address this, normalisation methods similar to those employed by international higher education ranking agencies for internationalisation may act as one of the solutions. The normalisation process may involve selecting a population measure, such as the overall population or ages 15–25, for each state. Utilising the total population of each state, a logarithmic transformation may be applied to these numbers to mitigate size-related discrepancies,” added the paper.
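The normalisation the paper sketches can be expressed as scaling a diversity count by the logarithm of the state's population. The function name, the base-10 logarithm and the specific formula below are illustrative assumptions built on that sketch, not the NIRF's actual methodology.

```python
import math

def normalised_diversity(students_from_state: int, state_population: int) -> float:
    """Divide a regional-diversity count by the log of the state's population.

    The log transform compresses large population differences, so a
    high-population state no longer dominates the metric outright.
    This formula is a hypothetical illustration of the paper's suggestion.
    """
    return students_from_state / math.log10(state_population)

# 100 students from a state of 100 million people:
# log10(100_000_000) = 8, so the normalised score is 100 / 8 = 12.5
print(normalised_diversity(100, 100_000_000))
```

Under such a scheme, a state ten times more populous contributes only one extra unit to the divisor, rather than multiplying the denominator tenfold, which is the kind of size-related discrepancy the authors want mitigated.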
A paper published in 2021 titled “NIRF regional realities for intervention” by G Srinivas and S Salil also highlighted regional disparities with 80% of high-ranking institutions coming from four states in 2020.
On the online education parameter, the paper states that the assessment is entirely focused on the quantity of courses developed and made available on the SWAYAM portal, the education ministry’s online learning platform.
The rankings also lack specific mechanisms to directly assess teaching quality, overlooking crucial aspects such as classroom observations, student evaluations and alumni feedback, says the paper.
“Practical skills, including effective communication, problem solving and critical thinking, are integral to a holistic education. However, limited focus of the NIRF rankings on practical training elements, such as hands-on projects and internships, leads to an undervaluation of institutions that prioritise experiential learning,” states the paper.
It adds that the reliance of NIRF on self-reported data raises questions regarding the consistency and accuracy of the information presented.
“Without standardised reporting practices, the rankings may inadvertently favour institutions adept at presenting data in a favourable light rather than those genuinely excelling in academic parameters,” it states.