Times Higher Education World University Rankings

From Wikipedia, the free encyclopedia

Times Higher Education World University Rankings
Editor: Phil Baty
Categories: Higher education
Frequency: Annual
Publisher: Times Higher Education
First issue: 2004 (in partnership with QS); 2010 (on its own)
Country: United Kingdom
Language: English
Website: www.timeshighereducation.com/world-university-rankings/

The Times Higher Education World University Rankings, often referred to as the THE Rankings or just THE, is the annual publication of university rankings by the Times Higher Education magazine. The publisher had collaborated with Quacquarelli Symonds (QS) to publish the joint THE-QS World University Rankings from 2004 to 2009 before it turned to Thomson Reuters for a new ranking system from 2010 to 2013. In 2014, the magazine signed an agreement with Elsevier to provide it with the data used in compiling its annual rankings.

The publication includes global rankings of universities, including by subject and reputation. It has also begun publishing three regional tables, covering universities in Asia, Latin America, and the BRICS and emerging economies, which are ranked with separate criteria and weightings.

The THE Rankings is often considered one of the most widely observed university rankings, together with the Academic Ranking of World Universities, the QS World University Rankings, and others. It has been praised for the new, improved methodology introduced in 2010, but critics have raised concerns that this methodology underestimates institutions that do not focus on the sciences or teach in English, and that it relies on a subjective reputation survey.

History

The creation of the original Times Higher Education–QS World University Rankings has been credited to John O'Leary, a former editor of The Times Higher Education Supplement. Times Higher Education chose to partner with the educational and careers advice company QS to supply the data.

After the 2009 rankings, Times Higher Education took the decision to break from QS and signed an agreement with Thomson Reuters to provide the data for its annual World University Rankings from 2010 onwards. The publication developed a new rankings methodology in consultation with its readers, its editorial board and Thomson Reuters, which would collect and analyse the data used to produce the rankings on behalf of Times Higher Education. The first ranking under the new arrangement was published in September 2010.

Commenting on Times Higher Education's decision to split from QS, former editor Ann Mroz said, "universities deserve a rigorous, robust and transparent set of rankings – a serious tool for the sector, not just an annual curiosity." She went on to explain the decision to continue producing rankings without QS's involvement, saying: "The responsibility weighs heavy on our shoulders...we feel we have a duty to improve how we compile them."

Phil Baty, editor of the new Times Higher Education World University Rankings, admitted in Inside Higher Ed, "The rankings of the world's top universities that my magazine has been publishing for the past six years, and which have attracted enormous global attention, are not good enough. In fact, the surveys of reputation, which made up 40 percent of scores and which Times Higher Education until recently defended, had serious weaknesses. And it's clear that our research measures favored the sciences over the humanities."

He went on to describe previous attempts at peer review as "embarrassing" in The Australian: "The sample was simply too small, and the weighting too high, to be taken seriously." THE published its first rankings using its new methodology on 16 September 2010, a month earlier than previous years.

In 2010, the Times Higher Education World University Rankings, along with the QS World University Rankings and the Academic Ranking of World Universities, were described as the three most influential international university rankings. The Globe and Mail in that year also described the Times Higher Education World University Rankings as "arguably the most influential."

In 2014, Times Higher Education announced a series of important changes to its flagship World University Rankings and its suite of global university performance analyses, following a strategic review by its parent company, TES Global.

Methodology

Criteria and Weighting

The inaugural 2010–2011 methodology contained 13 separate indicators grouped under five categories: teaching (30 per cent of the final score), research (30 per cent), citations (research impact, 32.5 per cent), international mix (5 per cent) and industry income (2.5 per cent). This was an increase on the six indicators used in the Times–QS rankings published between 2004 and 2009.

A draft of the inaugural methodology was released on 3 June 2010. The draft stated that 13 indicators would be used at first, possibly rising to 16 in future rankings, and laid out the categories of indicators as "research indicators" (55 per cent), "institutional indicators" (25 per cent), "economic activity/innovation" (10 per cent) and "international diversity" (10 per cent). The names of the categories and their weightings were modified in the final methodology, released on 16 September 2010. The final methodology also set out the weighting assigned to each of the 13 indicators, shown below (with some updates from the 2022–23 methodology):

Overall indicator | Individual indicators and weightings
Industry income – innovation
  • Research income from industry (per academic staff): 2.5%
International diversity (currently: international outlook – staff, students, research)
  • Ratio of international to domestic staff: 3% (2.5% as of 2022–23)
  • Ratio of international to domestic students: 2% (2.5% as of 2022–23)
  • International collaboration: 2.5% (as of 2022–23)
Teaching – the learning environment
  • Reputational survey (teaching): 15%
  • PhDs awarded per academic: 6%
  • Undergraduates admitted per academic: 4.5%
  • Income per academic: 2.25%
  • PhDs/undergraduate degrees awarded: 2.25%
Research – volume, income and reputation
  • Reputational survey (research): 19.5% (18% as of 2022–23)
  • Research income (scaled): 5.25% (6% as of 2022–23)
  • Papers per research and academic staff: 4.5% (6% as of 2022–23)
  • Public research income/total research income: 0.75% (dropped in the current methodology)
Citations – research influence
  • Citation impact (normalised average citations per paper): 32.5% (30% as of 2022–23)
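Read together, the individual weightings above roll up to the five category totals quoted earlier: teaching 15 + 6 + 4.5 + 2.25 + 2.25 = 30 per cent, research 19.5 + 5.25 + 4.5 + 0.75 = 30 per cent, citations 32.5 per cent, international diversity 3 + 2 = 5 per cent and industry income 2.5 per cent, giving 100 per cent overall. A minimal sketch of that arithmetic (illustrative code only, not part of THE's methodology):

# Illustrative check: the 2010-11 indicator weights (per cent) sum to the
# published category totals and to 100 per cent overall.
weights_2010_11 = {
    "Teaching": [15, 6, 4.5, 2.25, 2.25],
    "Research": [19.5, 5.25, 4.5, 0.75],
    "Citations": [32.5],
    "International diversity": [3, 2],
    "Industry income": [2.5],
}
for category, parts in weights_2010_11.items():
    print(category, sum(parts), "%")
print("Overall", sum(sum(parts) for parts in weights_2010_11.values()), "%")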

Times Higher Education billed the methodology as "robust, transparent and sophisticated," stating that the final methodology was selected after 10 months of "detailed consultation with leading experts in global higher education," 250 pages of feedback from "50 senior figures across every continent" and 300 postings on its website. The overall ranking score was calculated by converting each dataset to z-scores, standardising the different types of data onto a common scale so that they could be compared and combined.
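THE has not published the code behind this calculation; the sketch below is only a minimal illustration of the z-score approach described above, in which each indicator is standardised to mean 0 and standard deviation 1 across institutions and then combined using the published weights. The institutions, raw figures and two-indicator subset are invented for illustration.

import statistics

# Minimal, illustrative sketch (not THE's actual code): standardise each
# indicator across institutions as a z-score, then combine with the weights.
raw = {
    "Uni A": {"citations": 95.0, "industry_income": 120.0},
    "Uni B": {"citations": 60.0, "industry_income": 300.0},
    "Uni C": {"citations": 80.0, "industry_income": 180.0},
}
weights = {"citations": 0.325, "industry_income": 0.025}  # toy subset of the weights above

def zscores(values):
    mean, sd = statistics.mean(values), statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

names = list(raw)
z = {name: {} for name in names}
for indicator in weights:
    for name, score in zip(names, zscores([raw[n][indicator] for n in names])):
        z[name][indicator] = score

overall = {n: sum(weights[i] * z[n][i] for i in weights) for n in names}
for name, score in sorted(overall.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 3))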

The reputational component of the rankings (34.5 per cent of the overall score – 15 per cent for teaching and 19.5 per cent for research) came from an Academic Reputation Survey conducted by Thomson Reuters in spring 2010. The survey gathered 13,388 responses from scholars who, according to THE, were "statistically representative of global higher education's geographical and subject mix." However, the response rate of the survey in 2022 was just 1.8 per cent. The magazine's category for "industry income – innovation" came from a single indicator: an institution's research income from industry scaled against the number of academic staff. The magazine stated that it used this data as a "proxy for high-quality knowledge transfer" and planned to add more indicators to the category in future years.

Data for citation impact (measured as a normalized average citation count per paper), comprising 32.5 per cent of the overall score, came from the 12,000 academic journals indexed by Thomson Reuters' Web of Science database over the five years from 2004 to 2008. The magazine stated that articles published in 2009–2010 had not yet fully accumulated citations in the database. The normalization of the data differed from the previous rankings system and was intended to "reflect variations in citation volume between different subject areas," so that institutions with high levels of research activity in the life sciences and other highly cited fields would not have an unfair advantage over institutions with high levels of research activity in the social sciences, which tend to attract fewer citations on average.
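THE does not disclose its exact normalization formula, but a common reading of the description above is that each paper's citation count is divided by the world-average count for its subject area (and publication period), with an institution's score being the average of these ratios. The subject baselines and papers below are invented purely for illustration.

# Illustrative field normalization of citation impact (invented figures): each
# paper's citations are divided by the average for its subject area, so that
# highly cited fields (e.g. life sciences) do not swamp lower-citing fields.
field_average = {"life sciences": 20.0, "social sciences": 5.0}

papers = [("life sciences", 40), ("life sciences", 10), ("social sciences", 10)]

ratios = [cites / field_average[field] for field, cites in papers]
impact = sum(ratios) / len(ratios)  # (2.0 + 0.5 + 2.0) / 3 = 1.5
print(f"Field-normalized citation impact: {impact:.2f} (1.0 = world average)")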

The magazine announced on 5 September 2011 that its 2011–2012 World University Rankings would be published on 6 October 2011. At the same time, the magazine revealed changes to the ranking formula that would be introduced with the new rankings. The methodology would continue to use 13 indicators across five broad categories and would keep its "fundamental foundations," but with some changes. Teaching and research would each remain worth 30 per cent of the overall score, and industry income would remain at 2.5 per cent. However, a new category, "international outlook – staff, students and research," would be introduced, making up 7.5 per cent of the final score. This category would include the proportion of international staff and students at each institution (included in the 2010–2011 rankings under the category of "international diversity") and would also add the proportion of each institution's research papers co-authored with at least one international partner. One 2010–2011 indicator, the institution's public research income, would be dropped.

On 13 September 2011, Times Higher Education announced that its 2011–2012 list would rank only the top 200 institutions. Phil Baty wrote that this was in the "interests of fairness," because "the lower down the tables you go, the more the data bunch up and the less meaningful the differentials between institutions become." However, Baty wrote that the rankings would also include the 200 institutions that fell immediately outside the official top 200 according to its data and methodology; this "best of the rest" list from 201 to 400 would be unranked and listed alphabetically. Baty wrote that the magazine intentionally ranks only around 1 per cent of the world's universities, in recognition that "not every university should aspire to be one of the global research elite." Nonetheless, the 2015/16 edition of the Times Higher Education World University Rankings ranked 800 universities, and Phil Baty announced that the 2016/17 edition, to be released on 21 September 2016, would rank "980 universities from 79 countries".

The methodology was changed for the 2011–12 rankings, as described above. Phil Baty, the rankings editor, has said that the THE World University Rankings are the only global university rankings to examine a university's teaching environment, as others focus purely on research. Baty has also written that the THE World University Rankings are the only rankings to put arts, humanities and social sciences research on an equal footing with the sciences. However, this claim is no longer accurate: in 2015, QS introduced faculty-area normalization to its QS World University Rankings, weighting citations data so that universities specializing in the life sciences and engineering do not receive an undue advantage.

In November 2014, the magazine announced further reforms to the methodology following a review by its parent company, TES Global. The major change was that all institutional data collection would be brought in-house, severing the connection with Thomson Reuters. In addition, research publication data would now be sourced from Elsevier's Scopus database.

Reception

Reception of the methodology has been mixed.

Ross Williams of the Melbourne Institute, commenting on the 2010–2011 draft, stated that the proposed methodology would favour more focused "science-based institutions with relatively few undergraduates" at the expense of institutions with more comprehensive programmes and more undergraduates, but also stated that the indicators were "academically robust" overall and that the use of scaled measures would reward productivity rather than overall influence. Steve Smith, president of Universities UK, praised the new methodology as being "less heavily weighted towards subjective assessments of reputation and uses more robust citation measures," which "bolsters confidence in the evaluation method." David Willetts, the British Minister of State for Universities and Science, praised the rankings, noting that "reputation counts for less this time, and the weight accorded to quality in teaching and learning is greater." In 2014, Willetts became chair of the TES Global Advisory Board, responsible for providing strategic advice to Times Higher Education.

Criticism

Times Higher Education places a high importance on citations in generating its rankings. Using citations as a metric of effective education is problematic in several ways and places universities that do not use English as their primary language at a disadvantage. Because English has been adopted as the international language of most academic societies and journals, publications in languages other than English are less visible and attract fewer citations. The methodology has therefore been criticized as inappropriate and insufficiently comprehensive. A second important disadvantage for universities outside the English-language tradition is that in the social sciences and humanities the main vehicle for publication is the book, which is rarely, if ever, covered by digital citation records.

Times Higher Education has also been criticized for a strong bias towards institutions that teach the "hard sciences" and produce high-quality research output in those fields, often to the disadvantage of institutions focused on other subjects such as the social sciences and humanities. For instance, in the former THE-QS World University Rankings, the London School of Economics (LSE) was ranked 11th in the world in 2004 and 2005, but dropped to 66th and 67th in the 2008 and 2009 editions. In January 2010, THE concluded that the method employed by Quacquarelli Symonds, which conducted the survey on its behalf, was flawed in such a way that bias was introduced against certain institutions, including LSE.

A representative of Thomson Reuters, THE's new partner, commented on the controversy: "LSE stood at only 67th in the last Times Higher Education-QS World University Rankings – some mistake surely? Yes, and quite a big one." Nonetheless, after the change of data provider to Thomson Reuters the following year, LSE fell to 86th place, a ranking a Thomson Reuters representative described as "a fair reflection of their status as a world class university". Despite consistently ranking near the top of national league tables, LSE has been placed below many other British universities in the Times Higher Education World University Rankings in recent years, and other institutions, such as Sciences Po, have suffered from the same inherent methodological bias. Trinity College Dublin's ranking in 2015 and 2016 was lowered by a basic mistake in the data it had submitted; education administrator Bahram Bekhradnia said the fact that this went unnoticed evinced "very limited checking of data" "on the part of those who carry out such rankings". Bekhradnia also opined that "while Trinity College was a respected university which could be relied upon to provide honest data, unfortunately that was not the case with all universities worldwide."

In general, it is not clear for whom the rankings are intended. Many students, especially undergraduates, are not primarily interested in the research output of a higher education institution. The price of education also has no effect on the ranking, which means that fee-charging private universities in North America are compared directly with European universities; many European countries, such as France, Sweden and Germany, have a long tradition of offering free higher education.

In 2021, the University of Tsukuba in Ibaraki Prefecture, Japan, was alleged to have submitted falsified data on the number of international students enrolled at the university to the Times Higher Education World University Rankings. The discovery resulted in an investigation by THE and the provision of guidance to the university on data submission; it also drew criticism from faculty members over how easily THE's ranking system could be abused. The matter was discussed in Japan's National Diet on 21 April 2021.

Seven Indian Institutes of Technology (Bombay, Delhi, Kanpur, Guwahati, Madras, Roorkee and Kharagpur) have boycotted the THE rankings since 2020, citing concerns over transparency.

World Rankings

Times Higher Education World University Rankings – Top 10
Institution | 2023 | 2022 | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 | 2015 | 2014 | 2013 | 2012
University of Oxford (United Kingdom) | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 2 | 3 | 2 | 2 | 4
Harvard University (United States) | 2 | 2 | 3 | 7 | 6 | 6 | 6 | 6 | 2 | 2 | 4 | 2
University of Cambridge (United Kingdom) | 3 | 5 | 6 | 3 | 2 | 2 | 4 | 4 | 5 | 7 | 7 | 6
Stanford University (United States) | 3 | 4 | 2 | 4 | 3 | 3 | 3 | 3 | 4 | 4 | 3 | 2
Massachusetts Institute of Technology (United States) | 5 | 5 | 5 | 5 | 4 | 5 | 5 | 5 | 6 | 5 | 5 | 7
California Institute of Technology (United States) | 6 | 2 | 4 | 2 | 5 | 3 | 2 | 1 | 1 | 1 | 1 | 1
Princeton University (United States) | 7 | 7 | 9 | 6 | 7 | 7 | 7 | 7 | 7 | 6 | 6 | 5
University of California, Berkeley (United States) | 8 | 8 | 7 | 13 | 15 | 18 | 10 | 13 | 8 | 8 | 9 | 10
Yale University (United States) | 9 | 9 | 8 | 8 | 8 | 12 | 12 | 12 | 9 | 11 | 11 | 11
Imperial College London (United Kingdom) | 10 | 12 | 11 | 10 | 9 | 8 | 8 | 8 | 9 | 10 | 8 | 8

Young universities

Additionally, Times Higher Education publishes a ranking of universities under 50 years old (formerly the 150 Under 50 Universities), which uses different weightings of indicators to recognise the growth of young higher education institutions. In particular, the ranking attaches less weight to reputation indicators: in the 2022 edition, for instance, the University of Canberra (established in 1990) is placed 17th, while Paris Sciences et Lettres University (founded in 2010) is ranked 1st.

Subject

Various academic disciplines are sorted into six categories in THE's subject rankings: "Arts & Humanities"; "Clinical, Pre-clinical & Health"; "Engineering & Technology"; "Life Sciences"; "Physical Sciences"; and "Social Sciences".

World Reputation Rankings

Regions with universities included in the reputation league tables.

THE's World Reputation Rankings serve as a subsidiary of the overall league tables and rank universities independently according to their reputation scores.

Scott Jaschik of Inside Higher Ed said of the new rankings: "...Most outfits that do rankings get criticised for the relative weight given to reputation as opposed to objective measures. While Times Higher Education does overall rankings that combine various factors, it is today releasing rankings that can't be criticised for being unclear about the impact of reputation – as they are strictly of reputation."

Times Higher Education World Reputation Rankings – Top 10
Institution | 2022 | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 | 2015 | 2014 | 2013 | 2012 | 2011
Harvard University (United States) | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
Massachusetts Institute of Technology (United States) | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 4 | 2 | 2 | 2 | 2
Stanford University (United States) | 3 | 4 | 3 | 3 | 3 | 3 | 3 | 5 | 3 | 6 | 4 | 5
University of Oxford (United Kingdom) | 4 | 3 | 5 | 5 | 5 | 4 | 5 | 3 | 5 | 4 | 6 | 6
University of Cambridge (United Kingdom) | 5 | 5 | 4 | 4 | 4 | 4 | 4 | 2 | 4 | 3 | 3 | 3
University of California, Berkeley (United States) | 6 | 6 | 6 | 6 | 6 | 6 | 6 | 6 | 6 | 5 | 5 | 4
Princeton University (United States) | 7 | 7 | 7 | 7 | 7 | 7 | 7 | 7 | 7 | 7 | 7 | 7
Yale University (United States) | 8 | 8 | 8 | 8 | 8 | 8 | 8 | 8 | 8 | 10 | 10 | 9
Tsinghua University (China) | 9 | 10 | 13 | 14 | 14 | 14 | 18 | 26 | 36 | 35 | 30 | 35
University of Tokyo (Japan) | 10 | 13 | 10 | 11 | 13 | 11 | 12 | 12 | 11 | 9 | 8 | 8

Regional rankings

Asia

From 2013 to 2015, the outcomes of the Times Higher Education Asia University Rankings were the same as the Asian universities' positions in its World University Rankings. In 2016, the Asia University Rankings were revamped so that they "use the same 13 performance indicators as the THE World University Rankings, but have been recalibrated to reflect the attributes of Asia's institutions."

Times Higher Education Asia University Rankings – Top 10
Institution | 2023 | 2022 | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 | 2015 | 2014 | 2013
Tsinghua University (China) | 1 | 1 | 1 | 1 | 1 | 2 | 3 | 5 | 5 | 6 | 6
Peking University (China) | 2 | 2 | 2 | 2 | 5 | 3 | 2 | 2 | 4 | 5 | 4
National University of Singapore (Singapore) | 3 | 3 | 3 | 3 | 2 | 1 | 1 | 1 | 2 | 2 | 2
University of Hong Kong (Hong Kong) | 4 | 4 | 4 | 4 | 4 | 4 | 5 | 4 | 3 | 3 | 3
Nanyang Technological University (Singapore) | 5 | 5 | 5 | 6 | 6 | 5 | 4 | 2 | 10 | 11 | 11
Chinese University of Hong Kong (Hong Kong) | 6 | 7 | 7 | 8 | 7 | 7 | 11 | 13 | 13 | 12 | 12
Hong Kong University of Science and Technology (Hong Kong) | 7 | 9 | 8 | 5 | 3 | 5 | 6 | 6 | 7 | 9 | 9
University of Tokyo (Japan) | 8 | 6 | 6 | 7 | 8 | 8 | 7 | 7 | 1 | 1 | 1
Fudan University (China) | 9 | 10 | 11 | 17 | 17 | 16 | 16 | 19 | 24 | 25 | 24
Shanghai Jiao Tong University (China) | 9 | 13 | 16 | 19 | 24 | =20 | 18 | =32 | 39 | 47 | 40

Emerging economies

The Times Higher Education Emerging Economies Rankings (formerly known as the BRICS & Emerging Economies Rankings) only includes universities in countries classified as "emerging economies" by FTSE Group, including the "BRICS" nations of Brazil, Russia, India, China and South Africa. Hong Kong institutions are not included in this ranking.

Times Higher Education BRICS & Emerging Economies Rankings – Top 10
Institution | 2022 | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 | 2015 | 2014
Peking University (China) | 1 | 2 | 2 | 2 | 1 | 1 | 1 | 1 | 1
Tsinghua University (China) | 2 | 1 | 1 | 1 | 2 | 2 | 2 | 2 | 2
Zhejiang University (China) | 3 | 3 | 3 | 3 | 6 | 9 | 8 | 21 | 22
Fudan University (China) | 4 | 4 | 7 | 6 | 4 | 6 | 17 | 9 | 8
Shanghai Jiao Tong University (China) | 5 | 5 | 6 | 8 | 7 | 7 | 7 | 16 | 27
Lomonosov Moscow State University (Russia) | 6 | 6 | 5 | 5 | 3 | 3 | 3 | 5 | 10
University of Science and Technology of China (China) | 7 | 7 | 4 | 4 | 5 | 5 | 7 | 11 | 6
Nanjing University (China) | 8 | 9 | 9 | 7 | 8 | 11 | 14 | 22 | 18
National Taiwan University (Taiwan) | 9 | 8 | 8 | 10 | 10 | 10 | 5 | 6 | 4
Moscow Institute of Physics and Technology (Russia) | 10 | 11 | 12 | 12 | 11 | 12 | 9 | 36 | 9
