This has been a huge learning opportunity for those of us who have been involved in this effort.
Receiving such a large response from educational institutions was the first surprise. That the institutions
were so keen to position themselves amongst their peers was a very pleasant thing to learn. This speaks
volumes for the institutional aspirations to move towards quality, and augurs well for the future evolution of
higher educational institutions.
In this respect, it is relevant to mention that the response from universities and from engineering, management and pharmacy institutions was very enthusiastic, even overwhelming, and therefore particularly gratifying. The response was uniformly spread across Government-funded and privately-funded or self-financing institutions. This makes the ranking effort truly representative and worthwhile.
However, despite wide publicity, the response from the Architecture and general degree colleges fell short of
being truly representative. Hence, for this year, it was decided not to rank institutions belonging to these two
categories. We offer apologies to institutions in these two categories that took the trouble to register and
populate the data; but it became clear to us that there were significant pockets of relative vacuum. In the
absence of representative participation, the ranking exercise here would have become meaningless.
On the downside, the associated learning has been that educational institutions have to be more careful about their data. Many institutions failed to put their best foot forward when it came to supplying quality data. The point being emphasized here is "quality". Some of the institutions were definitely casual in supplying the data sought. The reliability of an exercise like this depends entirely on the reliability of the data. This is clearly an area in which further work is needed. Institutions need to appreciate that supplying honest and reliable data is important for their own image and for how they want to be seen by their pupils and the world at large.
In fact, a major fraction of the ranking effort went into improving data quality. A large number of our volunteers were deeply involved here. We used an outlier-based approach to detect bad data points, and repeatedly contacted the institutions concerned to examine suspect figures and to supply corrected values instead. While some amount of automation was used for the task, significant human interventions were needed, for which our officials, volunteers and staff (both at NBA and INFLIBNET) spent many sleepless nights. We are grateful to the many professional volunteers involved in the exercise, who gave their time for this task selflessly.
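To illustrate the outlier-based screening mentioned above, here is a minimal sketch. It is not the actual NIRF tooling; the field names, the IQR fence rule and the figures are assumptions for illustration.

```python
# Minimal sketch of outlier-based screening of submitted data.
# Field names, the IQR fence rule and the figures are illustrative only.
import statistics

def flag_outliers(submissions, field, k=1.5):
    """Yield institutions whose value for `field` falls outside IQR fences."""
    values = [s[field] for s in submissions]
    q1, _, q3 = statistics.quantiles(values, n=4)
    low, high = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    for s in submissions:
        if not (low <= s[field] <= high):
            yield s["institution"], s[field]

# Hypothetical faculty-student-ratio submissions; the last looks like a typo.
data = [{"institution": c, "fsr": v} for c, v in [
    ("College A", 14.2), ("College B", 15.8), ("College C", 13.9),
    ("College D", 16.4), ("College E", 15.1), ("College F", 14.8),
    ("College G", 13.5), ("College H", 240.0),
]]
for name, value in flag_outliers(data, "fsr"):
    print(f"Suspect value for {name}: {value}; refer back to the institution")
```

Flagged institutions would then be contacted to confirm or correct the figure, as described above.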
In fact, the major learning from this exercise for the future is to give deeper attention to this problem. Going
forward, we need to find ways for many of the relevant data to be populated directly from a few independent
sources. There is a need to organize a major effort to create such sources of independent data, as well as to do away with parameters for which such sources cannot be reliably identified.
To put this in better perspective, the data on research publications and citations was taken directly from resources published by Elsevier (Scopus), Thomson Reuters (Web of Science) and the Indian Citation Index. We had few problems with data taken from these sources.
We have taken every care to be objectively neutral in our work here. However, it became necessary at times to effect data corrections where common reasoning dictated that there was an obvious error due to carelessness. The number of such interventions is very small, and in all cases they had little potential to affect the overall ranking process.
In spite of taking all care, we are aware of the many pitfalls and deficiencies of a first attempt, and of possible imperfections in the results. We seek the indulgence of all stakeholders and hope to work with them in developing future versions of this national exercise.
Surendra Prasad
Chairman, NBA
1. Background
The process of framing the National Institutional Ranking Framework (NIRF) began on October 9, 2014 with the constitution of a 16-member core committee under the chairmanship of Secretary (HE), Ministry of Human Resource Development. The terms of reference of the Committee were to suggest a reliable, transparent and authentic national framework for performance measurement and ranking of institutions of higher education, and to recommend institutional mechanisms, processes and timelines for implementation of the National Institutional Ranking Framework. The framework was given its final shape after intense discussions and deliberations during a series of meetings of the Committee and exchanges with peers and stakeholders through many online discussions.
The National Institutional Ranking Framework (NIRF) for the engineering and management categories was unveiled by Smt. Smriti Irani, Honourable Minister for Human Resource Development, on 29th September 2015. Soon after, the NIRF documents for ranking of pharmacy and architecture institutions, as well as colleges and universities, were released. The NIRF envisaged separate rankings for different categories of institutions within their own respective peer groups. Further, within each discipline, there was provision for separate ranking in two categories: institutions engaged in research and teaching (Category A), and those engaged mainly in teaching (Category B).
The final framework identified nearly 22 parameters under five major heads, several of which are similar to those employed globally, such as excellence in teaching, learning and research. However, a few are India-centric, reflecting the aspirations of the rising number of young people of a vast nation. Country-specific parameters relevant to the Indian situation include regional and international diversity, outreach, gender equity and inclusion of disadvantaged sections of society.
“The Ranking framework will empower a larger number of Indian
Institutions to participate in the global rankings, and create a significant
impact internationally too. I see this as a sensitization process and an
empowering tool, and not a tool for protection.”
Smriti Zubin Irani, HRD Minister
“I don't see this becoming encompassing and exhaustive in the next two years, but a beginning to that end. Our framework is a moderated version of QS
and is self-reporting. Though all central universities will be part of it, I hope
good institutions from private as well as other sectors will aspire to be part of
rankings,”
Vinay Sheel Oberoi, Secretary (HE), MHRD
2. NIRF Parameters for Ranking of Institutions
The NIRF provides for ranking of institutions under five broad generic parameters, namely: i) Teaching, Learning and
Resources; ii) Research, Consulting and Collaborative Performance; iii) Graduation Outcome; iv) Outreach and
Inclusivity; and v) Perception.
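The overall score behind each rank is, in general form, a weighted combination of an institution's scores on these five parameters. The sketch below illustrates only that general form; the weights shown are placeholders, not the official weightages, which are discipline-specific and given in the framework documents described in the next section.

```python
# General form of the overall score: a weighted sum of the five broad
# parameter scores. The weights here are placeholders, NOT the official
# NIRF weightages (those are discipline-specific and published by NIRF).
WEIGHTS = {"TLR": 0.30, "RPC": 0.30, "GO": 0.15, "OI": 0.15, "PR": 0.10}

def overall_score(scores):
    """Combine per-parameter scores (each on a 0-100 scale)."""
    assert set(scores) == set(WEIGHTS)
    return sum(WEIGHTS[p] * scores[p] for p in WEIGHTS)

# Hypothetical institution scoring 80/75/90/70/60 on TLR/RPC/GO/OI/PR:
print(overall_score({"TLR": 80, "RPC": 75, "GO": 90, "OI": 70, "PR": 60}))  # 76.5
```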
3. Metrics to Compute Ranking Scores
The discipline-specific ranking frameworks for all the six categories of institutions mentioned above are available on
the NIRF Web site (https://www.nirfindia.org/). These documents identify the relevant data required to suitably
“The NIRF has been drafted to provide an Indian context to
educational aspirations and needs.”
Smriti Zubin Irani, HRD Minister
[Figure: overview of ranking parameters. Teaching, Learning and Resources (TLR): Faculty-Student Ratio with Permanent Faculty (FSR); Faculty-Student Ratio with Visiting Faculty; Metric for Faculty with Ph.D. and Experience (FQE); Metric for Library, Studio & Laboratory Facilities (LL); Metric for Sports and Extra-Curricular Facilities (SEC). Outreach and Inclusivity (OI).]
The worksheets covered heads such as Consultancy, Education Programme, Students, Public Examinations, Student Events and Facilities for the Physically Challenged. A separate Data Submission Utility was designed for each of the six disciplines.
The Data Submission Utility provided for the generation of an XML file covering all twelve worksheets (except Read-me-first) once all worksheets were duly filled in and validated. This XML file could then be uploaded to the NIRF website.
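As a rough illustration of this workflow, the sketch below serializes worksheet-style data into an XML file. The element names and structure are assumptions for illustration; the actual schema is defined by the NIRF Data Submission Utility.

```python
# Minimal sketch of turning validated worksheet data into an XML upload file.
# Element names and structure are illustrative assumptions, not the actual
# schema of the NIRF Data Submission Utility.
import xml.etree.ElementTree as ET

def worksheets_to_xml(institution_id, worksheets):
    """Serialize a dict of worksheet-name -> list of row dicts into XML."""
    root = ET.Element("submission", attrib={"institution": institution_id})
    for name, rows in worksheets.items():
        sheet = ET.SubElement(root, "worksheet", attrib={"name": name})
        for row in rows:
            record = ET.SubElement(sheet, "record")
            for field, value in row.items():
                ET.SubElement(record, field).text = str(value)
    return ET.ElementTree(root)

# Hypothetical institution ID and worksheet rows:
tree = worksheets_to_xml("IR-1-E-0001", {
    "Students": [{"programme": "UG", "approved_intake": 240, "enrolled": 236}],
    "Consultancy": [{"year": 2014, "amount_inr": 1250000}],
})
tree.write("nirf_submission.xml", encoding="utf-8", xml_declaration=True)
```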
5.2.3 Publications, Citations and Collaboration: Web of Science, Scopus and Indian Citation Index
Three citation databases were used as sources for retrieving the number of publications, citations and collaborative publications for institutions that registered themselves for ranking in one or more disciplines. These citation databases are: i) Science Citation Index (SCI), Social Science Citation Index (SSCI) and the Arts and Humanities Citation Index (A&HCI), hosted on the Web of Science platform; ii) Scopus; and iii) the Indian Citation Index. Google Scholar, initially proposed as one of the sources for publications and citations, was dropped as a resource since it does not facilitate searching for publications and citations by institution name. These three sources of publications and citations cover almost all disciplines.
These three databases were searched to determine the quantitative productivity of all 3,565 institutions that registered themselves for ranking, in terms of the research articles published by them and the citations received by these publications over a span of three years, i.e. 2012, 2013 and 2014.
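A minimal sketch of this tallying step is shown below. The record layout is an assumption for illustration, since real exports from the three databases differ in format.

```python
# Sketch of tallying per-institution publications and citations over the
# 2012-2014 window. The record layout is an assumption for illustration;
# actual exports from WoS, Scopus and ICI have their own formats.
from collections import defaultdict

records = [  # hypothetical rows: (institution, publication year, citations)
    ("IIT Madras", 2012, 14), ("IIT Madras", 2014, 3),
    ("Panjab University", 2013, 7), ("Panjab University", 2011, 9),
]

pubs, cites = defaultdict(int), defaultdict(int)
for institution, year, citations in records:
    if 2012 <= year <= 2014:      # restrict to the three-year window
        pubs[institution] += 1    # each record counts as one publication
        cites[institution] += citations

for inst in sorted(pubs):
    print(f"{inst}: {pubs[inst]} publications, {cites[inst]} citations")
```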
5.2.3.1 Search Strategy for Retrieving Research Publications, Citations and Collaborations from WoS, Scopus and ICI
All permutations, combinations and changes in the names of institutions were used while searching for articles published by faculty and researchers in the three databases mentioned above. It was realized that although Scopus provides affiliation IDs (at times multiple IDs for the same institution), it does so only for a limited number of Indian institutions. As such, a search strategy had to be defined and used for retrieving publication, citation and collaboration data from all three citation indices. Since searches were conducted using the names of institutions, articles that did not carry the institutional affiliations of their faculty and researchers could not be retrieved.
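The kind of name-variant expansion such a search strategy relies on can be sketched as follows. The abbreviation table and rewrite rules here are illustrative assumptions; the real strategies were tuned per database.

```python
# Sketch of expanding an institution's name into affiliation-search variants.
# The abbreviation table and rewrite rules are illustrative assumptions.
ABBREVIATIONS = {
    "Indian Institute of Technology": "IIT",
    "National Institute of Technology": "NIT",
}

def name_variants(name):
    """Return plausible spellings/abbreviations of `name` for searching."""
    # Start with the name as given and a comma-free, space-normalized form.
    variants = {name, " ".join(name.replace(",", " ").split())}
    for v in list(variants):
        for full, short in ABBREVIATIONS.items():
            if full in v:
                variants.add(v.replace(full, short))
    return variants

print(sorted(name_variants("Indian Institute of Technology, Madras")))
# ['IIT Madras', 'IIT, Madras',
#  'Indian Institute of Technology Madras',
#  'Indian Institute of Technology, Madras']
```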
Several universities host other research institutions within their physical premises. Care was taken to ensure that credit for publications and citations is given to the correct institution. Manual checking of retrieved data was done in cases i) where two institutions with the same (or similar) names share the same physical premises; and ii) where institutions have the same name in the same city, for example Government Colleges, DAV Colleges, etc.
The Universities Handbook 2014 (Association of Indian Universities, 2014) and the websites of institutions were used to trace changes in the names of institutions. Many variations in the names of universities, their physical locations and their spellings were discovered during the searches in all three databases. The task was challenging. The NIRF has taken every care to be accurate on this count, and our relevant academic partners, Elsevier (Scopus), Thomson Reuters (Web of Science) and the Indian Citation Index, have validated the results on a sample basis.
5.2.3.2 Restricting Retrieval of Articles to a Given Discipline
Subject-wise and discipline-wise search was not done in cases of universities and degree colleges applying for ranking
under universities and colleges respectively. Likewise, subject-wise or discipline-wise search was not made for
institutions largely devoted to subjects under which ranking is sought, e.g., the IITs and other engineering institutions,
colleges of pharmacy (for pharmacy) and IIMs (for management). However, subject category-wise search was made in
the following cases:
i) Universities applying for ranking under other subject categories. Example: Jamia Millia Islamia has applied for more than one discipline.
ii) Institutions applying for ranking in multiple categories. Example: IIITM, Gwalior has applied separately for
ranking in Management and Engineering disciplines.
5.2.4 Perception
Perception is a very important parameter for obtaining stakeholder feedback. In order to get a calibrated view, it was decided to elicit feedback separately from carefully identified peer groups as well as from the general public, including students, parents and employers who wished to provide such feedback. For the latter, a series of public advertisements was issued and the public response was received on the NIRF portal, specially created for this purpose. The response was extremely enthusiastic: we received nearly 1,50,000 votes from nearly 60,000 individuals. Many of these were for Category B institutions.
The peer groups were selected from a list of eminent and accomplished individuals from academia, industry, R&D
institutions and other Government organizations. Special letters were sent to them inviting them to provide feedback
directly on the NIRF portal. We received votes for about 500 institutions from the peer groups.
For a first-time effort, the response indicates a fair level of interest and participation from both the general public and peers.
5.3 Data Mining and Data Verification
Issues and pitfalls in the process of data collection, verification, authentication and interpretation were addressed by the Implementation Core Committee (ICC) set up by the MHRD to oversee the implementation of the ranking work for the year 2015. Besides, committees consisting of experts from academic institutions were constituted to examine the data submitted by institutions under each of the five broad generic parameters, for each discipline. These committees examined the data on the various parameters minutely and identified outliers and anomalies for further scrutiny. Such institutions were contacted telephonically and via e-mail to confirm or correct the data. Nodal officers from some of the institutions were also called in person to interact with the members of the committee and verify their data. For increased transparency, the final data, as it relates to each institution, is also being put up on our website. A polite advisory has also been issued to each institution to upload this data on its own website for dissemination to the public.
5.4 Interpretation and Minor Deviations
Based on issues and pitfalls related to data collection, verification, authentication and interpretation brought to the notice of the Implementation Core Committee (ICC), the following interpretations and minor deviations were introduced in the metrics and parameters defined in the NIRF:
i) The Indian Citation Index was used as a source of publications and citations in place of Google Scholar for all disciplines, since the latter does not support searching for publications by institution name.
ii) Collaborative publications were taken only from Web of Science and Scopus. As such, their weightages were correspondingly increased.
iii) Publications provided by individual institutions were not considered, because of the incomplete, wrong and misleading bibliographic information supplied by several individual institutions. In any case, we decided to follow the best international practices in this domain.
iv) Top positions in inter-college and extra-curricular events were considered irrespective of whether the events were at the state or national level. This is because many institutions did not care to distinguish between the various levels of activities.
“A ranking system is needed to capture all facets of education,
teaching, learning and facilities for overall development of
students.”
Anil D Sahasrabudhe, Chairman, AICTE
v) Average publications and patents per faculty member for an institute were calculated using the larger of the faculty strength required under a notional FSR of 1:15 and the actual number of faculty in position (a small worked sketch of this rule follows the list).
vi) Only university examinations were considered and weightage for public examination was transferred to the
former.
vii) In continuing education and services, training programmes/workshops that were imparted for 6 to 30 days, with the number of participants ranging from 10 to 50, were considered.
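As flagged in item (v) above, here is a small worked sketch of that rule; the figures are made up.

```python
# Sketch of deviation (v): the denominator for per-faculty publication and
# patent metrics is the larger of the faculty strength implied by a notional
# 1:15 faculty-student ratio and the actual faculty in position.

def per_faculty(output_count, students, actual_faculty, notional_fsr=15):
    required_faculty = students / notional_fsr  # faculty needed at FSR 1:15
    return output_count / max(required_faculty, actual_faculty)

# A hypothetical institute with 3,000 students needs 200 faculty at 1:15;
# with only 150 in position, the larger figure (200) is used, which lowers
# the average and removes any advantage from under-staffing.
print(per_faculty(output_count=400, students=3000, actual_faculty=150))  # 2.0
```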
5.5 Ranking Thresholds
The following sets of institutions were not considered for ranking (the intake thresholds in items (i) and (ii) below are sketched in code after this list):
i) Institutions that applied for ranking in Engineering - Category A whose total approved undergraduate intake was less than 200;
ii) Institutions that applied for ranking in Engineering - Category B whose total approved undergraduate intake was less than 350;
iii) Architecture Institutions – because of a very limited response from this category of institutions;
iv) General degree colleges – since very few of those that applied were above a basic threshold of performance on
many of the important parameters; and
v) Open universities.
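The intake thresholds in items (i) and (ii) amount to a simple eligibility filter, sketched below with hypothetical inputs.

```python
# Sketch of the approved-intake thresholds in items (i) and (ii):
# engineering applicants below the floor for their category were not ranked.
THRESHOLDS = {"A": 200, "B": 350}  # minimum approved UG intake per category

def eligible_for_ranking(category, approved_intake):
    return approved_intake >= THRESHOLDS[category]

print(eligible_for_ranking("A", 180))  # False: below the Category A floor
print(eligible_for_ranking("B", 400))  # True
```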
6.1. Engineering (Category A- Research & Teaching)
The IITs (old and new) lead the league table for Engineering (Category A: Research & Teaching), occupying the first eleven positions. Predictably, the first five IITs, established during the 1950s and early 1960s, take the first five positions, led by IIT Madras in the top slot with a weighted score of 89.42, followed by IIT Bombay, IIT Kharagpur, IIT Delhi and IIT Kanpur with weighted scores of 87.67, 83.91, 82.03 and 81.07 respectively. In addition to the five old IITs, several more IITs secured distinction in terms of weighted score, viz., IIT Roorkee (rank 6; weighted score: 78.68), while IIT Hyderabad (rank 20), IIT Jodhpur (rank 25) and IIT Bhubaneswar (rank 26), set up in 2008-2009, can truly be considered rising stars, since all these new IITs secured positions within the first thirty slots.
Generally, institutions funded directly by the Central Government are head and shoulders above the others, be they privately-funded or State Government-funded institutions. While this is clearly on expected lines, it also brings out the significant gap between these institutions and the others. Clearly, there are deeper lessons to learn here. Having said that, there are interesting new institutions which are slowly but surely creeping into the top slots. We should certainly watch out for them in the years to come. It is also possible to argue that some of the parameters, like funding for libraries and laboratories, load the dice in favour of the Centrally-funded institutions, and it would be interesting to see how some of these institutions would perform on "outputs" and "outcomes" on a per-Rupee investment scale.
6.2. Management (Category A – Research and Teaching)
Predictably, the IIMs hog the top slots amongst the management institutions. The first six ranks are occupied by IIMs, led by IIM Bangalore at rank 1 with a weighted score of 93.04, closely followed by IIM Ahmedabad (rank 2), IIM Calcutta (rank 3) and IIM Lucknow (rank 4) with weighted scores of 89.92, 87.45 and 86.13 respectively. IIM Udaipur and IIM Kozhikode are at positions 5 and 6 with weighted scores of 84.23 and 81.97 respectively. The remaining seven IIMs are interspersed between ranks 10 and 28. International Management Institute, New Delhi, a private institution, is at rank 7 with a weighted score of 81.78.
In addition to the top seven management institutions mentioned above, seven more secured distinction in terms of weighted scores. These institutions are: Indian Institute of Forest Management (rank 8; weighted score: 81.01); Department of Management Studies, IIT Kanpur (rank 9; weighted score: 79.44); IIM Indore (rank 10; weighted score: 78.63); Management Development Institute (rank 11; weighted score: 77.45); International Management Institute, Kolkata (rank 12; weighted score: 77.17); and Xavier Labour Relations Institute (XLRI) (rank 13). The management schools of IIT Kanpur (rank 9), Vellore Institute of Technology (rank 17) and Birla Institute of Technology (rank 25) have also claimed top slots amongst the management institutions. Several private management institutions also claim top slots in the ranking, including International Management Institute, New Delhi (rank 7), International Management Institute, Kolkata (rank 12), Xavier Labour Relations Institute (XLRI) (rank 13), Thiagarajar School of Management (rank 15) and S. P. Jain Institute of Management and Research (rank 16).
Rising Stars: Indian Institute of Management, Udaipur (rank 5), International Management Institute, Kolkata (rank 12), Indian Institute of Management, Tiruchirappalli (rank 14), Indian Institute of Management, Raipur (rank 18), Indian Institute of Management, Rohtak (rank 19), Rajiv Gandhi Indian Institute of Management (rank 20), Indian Institute of Management, Kashipur (rank 21) and Indian Institute of Management, Ranchi (rank 28), all set up between 2007 and 2011 and having secured positions in the top 30 slots, are amongst the rising stars.
6.3. Pharmacy (Category A - Research and Teaching)
Low scores characterize the league table for Pharmacy (Category A: Research & Teaching). Manipal College of Pharmaceutical Sciences is at rank 1 with a weighted score of 77.87, closely followed by University Institute of Pharmaceutical Sciences, Chandigarh at rank 2 with a weighted score of 77.10. Jamia Hamdard (rank 3), Poona College of Pharmacy, Pune (rank 4) and Institute of Pharmacy, Nirma University (rank 5) obtained weighted scores of 71.39, 70.93 and 69.76 respectively.
6.4. Universities
Indian Institute of Science (IISc), Bengaluru is ranked first amongst universities with a weighted score of 91.81, closely followed by the Institute of Chemical Technology, Mumbai, formerly the University Department of Chemical Technology (UDCT), University of Mumbai, which stands at rank 2 with a weighted score of 87.58. Ranks 3 to 8 are occupied by the traditional, multi-disciplinary universities, led by Jawaharlal Nehru University (rank 3), followed by the University of Hyderabad (rank 4), Tezpur University (rank 5), the University of Delhi (rank 6), Banaras Hindu University (rank 7) and the Indian Institute of Space Science and Technology (rank 8), with weighted scores ranging from 86.46 (JNU) to 78.83 (IIST). Birla Institute of Technology and Science (BITS Pilani) is at the 9th position with a weighted score of 76.85, closely followed by Aligarh Muslim University at rank 10 with a weighted score of 76.62.
Visva Bharati (rank 11; weighted score: 76.11) and Panjab University (rank 12; weighted score: 76.07) also secured distinction in terms of weighted score, while two other universities, namely Pondicherry University (rank 13; weighted score: 74.44) and Bharathiar University (rank 14; weighted score: 74.32), come very close.
It is noteworthy that King George Medical University, Lucknow (rank 16), Bidhan Chandra Krishi Vishvavidyalaya (rank 31), Tamil Nadu Veterinary and Animal Science University (rank 36), Tamil Nadu Agricultural University (rank 39) and the National Institute of Food Technology Entrepreneurship and Management (rank 50), all discipline-specific institutions performing research and imparting education in the medical, agricultural, veterinary and animal sciences, could find their places amongst multi-disciplinary universities and technological institutions. This demonstrates that the ranking parameters and the metrics are generic enough to suit all kinds of educational institutions.
Rankings are based on the information and data provided by the institutions. Data on publications and citations have been taken from standard indexing and citation sources, viz., Scopus (Elsevier Science), Web of Science (Thomson Reuters) and the Indian Citation Index. Perception data has been compiled from inputs from the general public and peers. NIRF has done limited validation of the data; however, responsibility for the accuracy and authenticity of the data lies with the institutions supplying it. The data are available on the NIRF portal.
7. Computation of Scores and Rankings: Top 25 in Each Category
7.1. India Rankings 2016: Engineering Category "A"

Name | Est. | State | City | Weighted Score | Rank
Indian Institute of Technology Madras | 1959 | Tamil Nadu | Chennai | 89.42 | 1
Indian Institute of Technology Bombay | 1958 | Maharashtra | Bombay | 87.67 | 2
Indian Institute of Technology Kharagpur | 1951 | West Bengal | Kharagpur | 83.91 | 3
Indian Institute of Technology Delhi | 1961 | Delhi | New Delhi | 82.03 | 4
Indian Institute of Technology Kanpur | 1959 | Uttar Pradesh | Kanpur | 81.07 | 5
Indian Institute of Technology Roorkee | 2001 | Uttarakhand | Roorkee | 78.68 | 6
Indian Institute of Technology Hyderabad | 2008 | Telangana | Hyderabad | 77.23 | 7
Indian Institute of Technology Gandhinagar | 2008 | Gujarat | Ahmedabad | 75.21 | 8
Indian Institute of Technology Ropar | 2009 | Punjab | Rupnagar | 74.89 | 9
Indian Institute of Technology Patna | 2008 | Bihar | Patna | 74.68 | 10
Indian Institute of Technology Guwahati | 1994 | Assam | Guwahati | 74.63 | 11
National Institute of Technology Tiruchirappalli | 1964 | Tamil Nadu | Tiruchirappalli | 74.45 | 12
Vellore Institute of Technology, Vellore | 1984 | Tamil Nadu | Vellore | 74.40 | 13
Indian Institute of Technology (BHU), Varanasi | 2012 | Uttar Pradesh | Varanasi | 74.39 | 14
Sardar Vallabhbhai National Institute of Technology, Surat | 1961 | Gujarat | Surat | 73.13 | 15
Indian Institute of Technology Indore | 2009 | Madhya Pradesh | Indore | 72.01 | 16
Birla Institute of Technology, Ranchi | 1955 | Jharkhand | Ranchi | 71.81 | 17
Visvesvaraya National Institute of Technology, Nagpur | 1960 | Maharashtra | Nagpur | 71.29 | 18
National Institute of Technology Rourkela | 1961 | Odisha | Rourkela | 70.81 | 19
Indian Institute of Technology Mandi | 2008 | Himachal Pradesh | Mandi | 70.33 | 20
College of Engineering, Pune | 1854 | Maharashtra | Pune | 69.71 | 21
National Institute of Technology Karnataka | 1960 | Karnataka | Mangalore | 68.96 | 22
Motilal Nehru National Institute of Technology, Allahabad | 1961 | Uttar Pradesh | Allahabad | 67.95 | 23
PSG College of Technology, Coimbatore | 1951 | Tamil Nadu | Coimbatore | 67.80 | 24
Indian Institute of Technology Jodhpur | 2008 | Rajasthan | Jodhpur | 67.68 | 25