The Universities Ranking World Cup: A Global View by Continent and Country from the Computer Science Perspective

University rankings are lists of universities ordered according to different criteria and systems, published annually. This article uses seven of the world's best-known global rankings, considered only in the context of computer science: Shanghai Ranking's Academic Ranking of World Universities, CWTS Leiden, Quacquarelli Symonds, Times Higher Education, National Taiwan University, USNews Best Global Universities and University Ranking by Academic Performance. We present the top 10 universities in each ranking system and, by continent and country, those that appear in the top 20, top 100 and top 500. Each of these rankings uses different items and weights, which are listed and analyzed in this article. The results differ considerably from one another because they follow different systems. This article shows that in the top 10 there is a direct relationship between the massive presence of top Asian universities and total dependence on Web of Science publications. The same is not true when the data source is the Scopus database.


I. INTRODUCTION
"There is no doubt that the arrival on the scene of global classifications and rankings of universities has galvanized the world of higher education. Since the emergence of global rankings, universities have been unable to avoid national and international comparisons, and this has caused changes in the way universities function" [1].
Students and their families increasingly use university rankings when choosing which course to take on entering higher education. One of the merits of the information society is its contribution to a conscious choice when it comes to knowing which university is best to attend. Similarly, rankings are a tool used by professors and researchers who want to boost their careers and compete for a position at a university: the reality is that everyone wants to belong to a top institution. With an ordered listing of each institution's position, governments and investors can objectively decide who deserves their funding. Everyone can view rankings, as they are posted on the internet and easily accessible. Higher education institution (HEI) managers are increasingly aware of these rankings and try to align internal evaluation and hiring systems with their criteria to make their institutions more competitive and more visible. One can criticize the rankings, saying that they are not fair, that the items and weights adopted are not the correct ones, or that higher education is not a football league to be displayed as a table of results, but the reality is that nobody wants to be left out of these lists.
There are different types of rankings published annually: some are global, some are subdivided by area, and others list the institutions of a single country. This article uses seven of the world's best-known rankings, considered only in the context of computer science: Shanghai Ranking's Academic Ranking of World Universities (ARWU), CWTS Leiden Ranking (Leiden), Quacquarelli Symonds World University Rankings (QS), Times Higher Education World University Rankings (THE), the Ranking of Scientific Papers for World Universities of National Taiwan University (NTU Ranking), USNews Best Global Universities (USN) and University Ranking by Academic Performance (URAP). Other well-known and well-regarded rankings were left out because their criteria seemed less quantifiable to us, or because they do not break down by subject area, since we intended to study only the area of computer science. National rankings, like the various lists published in the UK by national newspapers, are not used in this article because our goal is to compare globally, by continent and country. Internationalization now makes sense: globalization is part of the life of a citizen of the world.
We present the top 10 universities in each ranking system and, by continent and country, those that appear in the top 20, top 100 and top 500. Each of these rankings uses different items and weights, which will be listed and analysed in this article. The results are very different from each other because they follow different systems. Many of the rankings have a large number of Asian (mainly Chinese) universities in top positions, while others consist mostly of American universities. This article compares the lists of computer science universities that appear in the rankings and analyses the criteria used to create each of them.
This article shows that there is a direct relationship between the massive presence of top Asian universities and the total dependence on Web of Science publications. The same is not true when the data source is the Scopus database.
This article began following the many reports about university rankings and their very different results. Those who do not know how rankings are built (namely, their criteria and weights) do not understand why there are so many differences between the lists. If there is a "fight" between the Asian continent and North America for the first places in these lists, the same is not true for the remaining places. Interestingly, we will venture in this article to show that Europe has a significant percentage of universities in the top 500 in the world, higher than the Asian continent and North America. In this case, the reason is not the type of research or database used, but the quantity and seniority of European universities. This paper is divided into six parts: this introduction; a second section explaining the methodology; a third section analysing the world rankings and characterizing the seven chosen systems; a fourth section listing the different results; a fifth section analysing these results; and a last section with the conclusions and future work.
University rankings are not perfect lists: if they were perfect and indisputable, there would be no need for so many rankings. We all know that since the first rankings were published, the world of higher education has changed, because comparisons cannot be avoided and rankings are a picture of what each HEI represents relative to its competitors.

II. METHODOLOGY
Initially we studied the existing rankings. Then we looked at articles that compare ranking systems. After this thorough study, we analysed each of the rankings one by one. As inclusion criteria, we kept only those rankings that were global and that included the sub-area of computer science (or similar), with pre-established weights and criteria. We began with fourteen ranking systems; after exclusion, seven remained.
We then made a comparative analysis of the items and weights that each of these rankings use to rank universities.
Then we extracted data from each of the seven rankings and presented the top 10 universities in each ranking system and, by continent and country, those that appeared in the top 20, top 100 and top 500. We also compared some universities around the world and their position in each of the seven rankings used, and found cases where there is a big difference between the positions of higher education institutions across the rankings studied. Therefore, for our study we used only seven rankings: ARWU, Leiden, NTU, QS, THE, URAP and USNews.

III. GLOBAL RANKINGS
Each system uses different criteria and weights. Next, we give a short characterization of each of the seven chosen rankings, followed by a table with the indicators and weights of each ranking system:

A. Academic Ranking of World Universities (ARWU)
The Academic Ranking of World Universities (ARWU) was first published in June 2003 by the Centre for World-Class Universities (CWCU), Graduate School of Education (formerly the Institute of Higher Education) of Shanghai Jiao Tong University, China, and is updated on an annual basis. Since 2009, ARWU has been published and copyrighted by Shanghai Ranking Consultancy (http://www.shanghairanking.com/), a fully independent organization on higher education intelligence, not legally subordinate to any university or government agency. Shanghai Ranking Consultancy began publishing rankings of world universities by academic subject in 2009. The first batch of ranked subjects were Mathematics, Physics, Chemistry, Computer Science and Economics/Business [5].
ARWU considers every university that has any Nobel Laureates, Fields Medallists, Highly Cited Researchers, or papers published in Nature or Science. In addition, universities with a significant number of papers indexed by the Science Citation Index-Expanded (SCIE) and Social Science Citation Index (SSCI) are also included. In total, more than 1800 universities are actually ranked and the best 1000 are published. Alumni are defined as those who obtained bachelor's, master's or doctoral degrees from the institution. Staff are defined as those who worked at the institution at the time of winning the prize. The weight is 100% for winners after 2011, 90% for winners in 2001-2010, 80% for winners in 1991-2000, 70% for winners in 1981-1990, and so on, down to 10% for winners in 1921-1930. Only the primary affiliations of Highly Cited Researchers are considered. To distinguish the order of author affiliation, a weight of 100% is assigned to the corresponding author affiliation, 50% to the first author affiliation (or the second author affiliation if the first author affiliation is the same as the corresponding author affiliation), 25% to the next author affiliation, and 10% to other author affiliations. The database used is Web of Science (http://www.webofscience.com/). This ranking lists the first 50 universities individually, then in the bands 51-75, 76-100, 101-150, 151-200, 201-300, 301-400 and 401-500.
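To make the decade-decay and author-affiliation weights described above concrete, a minimal sketch follows. It only illustrates the rules as stated; it is not ARWU's official implementation, and the function and dictionary names are our own.

```python
# Illustrative sketch of ARWU's weighting rules as described above;
# not the official implementation.

def award_weight(decade_start):
    """Weight for a prize won in the decade beginning at `decade_start`:
    100% for 2011 onwards, dropping 10 percentage points per earlier
    decade, down to 10% for 1921-1930."""
    if decade_start >= 2011:
        return 1.0
    decades_back = (2011 - decade_start) // 10
    return max(1.0 - 0.1 * decades_back, 0.1)

# Weights by author-affiliation position (hypothetical key names).
AFFILIATION_WEIGHTS = {
    "corresponding": 1.00,
    "first": 0.50,   # second author if first equals corresponding
    "next": 0.25,
    "other": 0.10,
}

print(award_weight(2001))  # 0.9 for a 2001-2010 winner
```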

B. CWTS Leiden Ranking (Leiden)
The CWTS Leiden Ranking is based on bibliographic data from the Web of Science database produced by Clarivate Analytics (http://www.webofscience.com/). The Leiden Ranking 2019 includes 963 universities from 56 different countries: all universities worldwide that produced at least 1000 Web of Science indexed publications in the period 2014-2017 [6]. Only so-called core publications are counted, which are publications in international scientific journals. To be classified as a core publication, a publication must satisfy the following criteria: be written in English, have one or more non-anonymous authors, not have been retracted, and have appeared in a core journal. To be considered a core journal, a journal must meet the following conditions: it must have an international scope, as reflected by the countries in which researchers publishing in the journal and citing it are located; and it must have a sufficiently large number of references to other core journals, indicating that it is situated in a field suitable for citation analysis.
Indicators included in the Leiden Ranking have two variants: a size-dependent and a size-independent variant. In general, size-dependent indicators are obtained by counting the absolute number of publications of a university that have a certain property, while size-independent indicators are obtained by calculating the proportion of the publications of a university with that property. For instance, the number of highly cited publications of a university and the number of publications of a university co-authored with other organizations are size-dependent indicators. The Leiden Ranking does not take into account conference proceedings or book publications, an important limitation in certain research fields. The Leiden Ranking focuses exclusively on the dimension of research performance. The citation impact indicators used in the Leiden Ranking are based on recent data. The Leiden Ranking does not rely on data supplied by the universities themselves, such as data on staff numbers. This ranking lists universities individually rather than in intervals (101-150, for example).

C. Quacquarelli Symonds World University Rankings (QS)
The QS World University Rankings [7] is an annual publication of university rankings by Quacquarelli Symonds (QS), previously published in partnership with Times Higher Education as the Times Higher Education-QS World University Rankings (2004 to 2009).
QS considers academic reputation based on its Academic Survey (expert opinions of over 94,000 individuals in the higher education space regarding teaching and research quality at the world's universities). The employer reputation metric is based on the QS Employer Survey (45,000 employer responses). The Citations per Faculty metric is calculated by dividing the total number of citations received by all papers produced by an institution across a five-year period by the number of faculty members at that institution. All citations data is sourced from Elsevier's Scopus database (https://www.scopus.com/). This ranking lists the top 50 universities individually, then in the bands 51-100, 101-150 and so on.
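As a sketch, the Citations per Faculty computation described above reduces to a simple ratio; the figures below are hypothetical, not real QS data.

```python
# Minimal sketch of QS's Citations per Faculty metric: total citations
# over a five-year window divided by faculty headcount (invented numbers).

def citations_per_faculty(total_citations_5y, faculty_count):
    return total_citations_5y / faculty_count

print(citations_per_faculty(120_000, 1_500))  # 80.0
```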

D. Times Higher Education World University Rankings (THE)
The Times Higher Education World University Rankings [8] was founded in 2004 in partnership with QS; the two have been published separately since 2009.
The Times Higher Education World University Rankings uses 13 performance indicators grouped into five areas. The Academic Reputation Survey 2018 had more than 20,000 responses. Research income is scaled against academic staff numbers. Research productivity counts the number of publications in academic journals indexed by Elsevier's Scopus database (https://www.scopus.com/) per scholar, scaled by institutional size and normalized by subject. The international indicator is calculated as the proportion of a university's total research journal publications that have at least one international co-author. Universities can be excluded from the World University Rankings if their research output amounts to fewer than 1,000 relevant publications in five years (with a minimum of 150 a year). Universities can also be excluded if 80 percent or more of their research output falls exclusively within one of the 11 subject areas. Institutions provide and sign off their institutional data for use in the rankings.

E. Performance Ranking of Scientific Papers for World Universities, National Taiwan University (NTU)
The Performance Ranking of Scientific Papers for World Universities [9] is released by National Taiwan University and is known as the NTU Ranking. The Higher Education Evaluation and Accreditation Council of Taiwan (HEEACT) first published the ranking in 2007, using comparatively objective methods and statistics to rank universities. The NTU Ranking provides an overall ranking, rankings by six fields (including Engineering), and rankings by 24 selected subjects. This ranking system employs bibliometric methods to analyse and rank the scientific paper performance of the world's top 800 universities. Data used to assess the performance of the universities are drawn from ISI's ESI and the Web of Science Core Collection (WOS), which includes the Science Citation Index (SCI), Social Sciences Citation Index (SSCI) and Journal Citation Reports (JCR).
In the computer science subject, this ranking lists a set of 305 universities in positions 1 to 298.

F. Best Global Universities for Computer Science, The U.S. News Ranking (USNews)
The overall Best Global Universities ranking [10] encompasses the top 1,500 institutions spread across 81 countries, up from 75 countries last year. The first step in producing these rankings, which are powered by Clarivate Analytics InCites, involves creating a pool of 1,599 universities that is used to rank the top 1,500 schools. Best Global Universities for Computer Science ranks 249 universities.

G. University Ranking by Academic Performance (URAP)
URAP is a non-profit organization established at the Informatics Institute of Middle East Technical University in 2009. The most recent rankings include 2500 HEIs around the world as well as 61 different specialized subject areas. The URAP 2018-2019 World Ranking is based on six academic performance indicators. Since URAP is an academic-performance-based ranking, publications constitute the basis of the ranking methodology; both the quality and quantity of publications and international research collaboration performance are used as indicators. URAP uses a measure of current scientific productivity that includes articles published in journals listed within the first, second and third quartiles in terms of their Journal Impact Factor. International collaboration is used as a measure of the global acceptance of a university. International collaboration data, based on the total number of articles published in collaboration with foreign universities, is obtained from Clarivate Analytics InCites for the last four years.

H. Summary of Indicators and Weights for Each of the Seven Rankings
The following table lists, for each of the seven ranking systems, the data source used (in this case Web of Science or Scopus), the criteria it uses and their weights. For example, the USNews indicators and weights are: Global research reputation (12.5%); Regional research reputation (12.5%); Publications (10%); Books (2.5%); Conferences (2.5%); Normalized citation impact (10%); Total citations (7.5%); Number of publications that are among the 10% most cited (12.5%); Percentage of total publications that are among the 10% most cited (10%); International collaboration relative to country (5%); International collaboration (5%); Number of highly cited papers that are among the top 1% most cited in their respective field (5%); Percentage of total publications that are among the top 1% most highly cited papers (5%).
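Although the seven systems differ, the general mechanics of combining weighted indicators into a composite score can be sketched as follows; the indicator names, weights and values here are hypothetical and do not come from any of the seven systems.

```python
# Generic sketch of a weighted composite ranking score (all values
# are invented; real systems also normalize indicators differently).

weights = {"reputation": 0.25, "publications": 0.10,
           "citation_impact": 0.10, "total_citations": 0.075}

scores = {"reputation": 90.0, "publications": 75.0,
          "citation_impact": 80.0, "total_citations": 60.0}

# Renormalize in case the chosen subset of weights does not sum to 1.
total_w = sum(weights.values())
composite = sum(weights[k] * scores[k] for k in weights) / total_w

print(round(composite, 2))  # 80.95
```

Because each system chooses its own indicators, weights and normalizations, the same university can receive very different composite scores, which is the source of the divergent lists analysed below.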

IV. RESULTS
In the next subsections we analyse the top 10, top 20, top 100 and top 500 of each of the rankings by continent and country. For the top 10 we also list the names of each of the top ten universities in each of the seven rankings. The final subsection summarizes the previous ones.

A. Top 10 Universities in Global Rankings, Subarea CS
The following table lists the percentage of presence in the top 10 by continent and ranking system. There are rankings whose top 10 is made up entirely of Asian universities (Leiden and NTU); however, there are lists that do not feature any Asian university in the top 10 (THE) or feature just one (ARWU and QS). There are top 10 lists consisting mostly of American universities (ARWU 70%, THE 60%). The top 10 lists composed entirely of universities from the Asian continent do not include any European universities. There are no universities from Oceania, Africa or Latin America in any top 10. In the following figure, we present the percentage of each continent in the composition of the top 10 universities in the world in the computer science sub-area, by ranking system. This makes the differences easier to see: the first charts (we grouped Leiden and NTU because both have 100% Asian universities) contrast with the last chart, which features only universities from North America and Europe.

B. Top 20 Universities in Global Rankings, Subarea CS
The following table lists the percentage of presence in the top 20 by continent and ranking system. There are rankings whose top 20 is made up overwhelmingly of Asian universities (Leiden 95% and NTU 80%); however, there are top 20 lists with a small percentage of Asian universities (ARWU 15%, QS and THE 20%). The top 20 lists with a strong Asian component (Leiden and NTU) do not include any European universities. There are no universities from Africa or Latin America in any of the top 20 lists. For nationals of the countries in question, it is interesting to see the following table showing the presence of each country in each of the ranking systems. We can see, from left to right, the increasing presence of US universities and the diminishing presence of institutions from China. In addition, the UK only appears in the right-hand rankings, while the reverse happens with Australia, South Korea, Saudi Arabia and Hong Kong.

C. Top 100 Universities in Global Rankings, Subarea CS
The following table lists the percentage of presence in the top 100 by continent and ranking system. There are rankings whose top 100 is made up largely of Asian universities (Leiden 59% and NTU 52.94%). There are no universities from Africa in the top 100. Latin American universities appear in 1% and 3% of the top 100 lists (Leiden and QS, respectively).

D. Top 500 Universities in Global Rankings, Subarea CS
The following table lists the percentage of presence in the top 500 by continent and ranking system. NTU and USNews do not have a ranking with 500 universities. It is very interesting to see that Europe has the highest percentage of universities in the top 500 of the various systems (except Leiden). The number of top 500 universities per country includes, for example:

Top 500         ARWU  Leiden   QS  URAP  THE
Colombia           1       1
Mexico             2       3    1
Canada            21      20   24    22   27
United States     81      91  105   108  119
Australia         16      23   21    23   21
New Zealand        1       7    2     3    2

E. Rankings Summary by Continent
Africa only appears in the top 500: Leiden, QS and THE 1%; ARWU and URAP 0.2%. None of the African universities appears in the top 500 of all rankings. The best ranked is Cairo University in Egypt, which appears in the 151-200 band of the QS ranking. Regarding Latin America, Brazil has several top universities, and three of them appear in all rankings: Universidade de Sao Paulo, Universidade Estadual de Campinas and Universidade Federal de Minas Gerais.

V. RESULTS ANALYSIS
Each of the universities has a very different place in each of the rankings. In the following table we list the positions of five universities. For example, Tsinghua University of China is in 1st place in Leiden, NTU, URAP and USNews, but "only" 7th in ARWU and 15th in QS and THE.
The following table shows the relationship between dependence on publication criteria and the percentage of Asian and American universities in the top 10, considering Web of Science as the source: the higher the first, the higher the latter. The Leiden and NTU systems are the clearest cases: the weight of WoS publications is 100%, and 100% of their top 10 universities are Asian. URAP features 70% Asian universities in its top 10 despite a 100% dependence on WoS publications. The reason may be the years to which the data relate: in the case of URAP, 21% refers to 2017 and 79% to the years 2013-2017; in the case of Leiden, the publications refer to the years 2014-2017; in the case of NTU, 50% refers to 2008-2018, 35% to 2017-2018 and 15% to 2018.
Fig. 6. Source, weight of publications and percentage of Asian and USA universities in the top 10.
Interestingly, the top 500 is fairly evenly divided between the universities of Europe, North America and Asia.

VI. CONCLUSION
We started by studying 14 rankings and then narrowed them down to seven. These seven were chosen because they cover the computer science subarea, are global, and are not rankings that each user can parameterize. The seven chosen were Shanghai Ranking's Academic Ranking of World Universities (ARWU), the CWTS Leiden Ranking (Leiden), the Quacquarelli Symonds World University Rankings (QS), the Times Higher Education World University Rankings (THE), the National Taiwan University Performance Ranking of Scientific Papers for World Universities (NTU Ranking), the USNews Best Global Universities (USNews) and the University Ranking by Academic Performance (URAP).
We studied the criteria and weights used by each of the systems. Finally, we listed the top 10, top 20, top 100 and top 500 of each of the seven rankings, the percentage of each continent in the top 10, and the positions of some well-known universities in each of the rankings.
We find that when there is a heavy reliance on the WoS database, the top 10 tends to be made up of Asian universities. The same is not true when there is a dependence (even a small one) on the Scopus database.
However, it turns out that the top 500 is fairly evenly divided between the universities of Europe, North America and Asia. In a future study we will examine how the ranking would be constituted if we used both databases, and possibly another database not used by any of these seven chosen rankings.

CONFLICT OF INTEREST
This study was carried out without a conflict of interest.