What is behind bibliometric indicators from the Web of Science?

This text examines the figures obtained through the Web of Science and a way to understand them when analysing Croatian, Serbian and Slovenian research productivity.

Measuring research performance and ranking research organisations accordingly has become a frequently visited topic for researchers, analysts, research organisations themselves, as well as the public at large. Scholarly attention to the topic is particularly notable among those interested in science as such, as well as those interested in economics and organisations, and in higher education institutions in particular, more than ever in the aftermath of the launch of the first Academic Ranking of World Universities (ARWU) in 2003 and the Times QS World University Ranking in 2004. Even though it is likely that rankings have had an impact on the general understanding of research performance and measurement, it is not our aim here to go into the issue of university rankings, as this would take us beyond the scope of the research presented here, and it has been extensively discussed elsewhere (Dill & Soo, 2005; van Raan, 2005; Shin, Toutkoushian, & Teichler, 2011). Our aim is rather to take a closer look at research performance in the three countries under study, in order to gain a better understanding of the research systems observed.

Evaluating research performance has become an increasingly debated topic among those who are expected to undergo evaluation, those who provide resources to research performers, and those who research and analyse science and research evaluation. In the scholarly literature, the state’s embrace of this trend is often described as part of a larger one, coined ‘the Rise of the Evaluative State’ (Dill, 1998; Neave, 1998), and it is closely related to the emergence of yet another term, New Public Management (NPM for short), which, simply put, refers to the adoption of private-sector management mechanisms by the public sector and a shift from an input-based system to one which relies on the output of a process. In the context of higher education research, a more elaborate definition may be of use, by which NPM refers to “the introduction of strategic planning, the setting in place of mechanisms and procedures for institutional self-assessment and the elaboration of more sophisticated indicators of cost control, performance evaluation, the paraphernalia for estimating academic productivity and institutional efficiency” (Groof, Neave, & Švec, 1998, p. 59).

In general, evaluation can be conducted during a process or once the process is over, i.e. it can be formative or summative, and it can be based on qualitative or quantitative methods. Moreover, in line with the trend outlined above, governments often use research evaluation to inform their policies and justify modes of funding allocation. However, some cases show that governments adopt evaluative measures towards research organisations for reasons such as ‘“good housekeeping” of research institutions, rather than as a basis for allocating research funds or assessing goal achievement’ (OECD, 1997, p. 30). Nevertheless, the instrumentality of research evaluation, or even of research output measurement, is undisputed in the process of strategic thinking, policy making and designing allocation mechanisms for public funding. The question which provokes much controversy, however, is how research and research organisations are actually evaluated and by whom. The latter can be done either internally, by those who engage in research themselves or by their peers, or externally, by evaluation agencies or government officials. As for the former, the question often boils down to performance indicators and the debate on how these are in effect constructed or which of them better indicate different aspects of research performance. Finally, as regards the object of evaluation, the method can be applied to an individual researcher, a research work or project, a research organisation, or an entire research system. In the case of the European Union, evaluation can also be done for the aggregate of 27 countries (and more). As expected, the reasoning behind research evaluation and the evaluation practices adopted by countries vary along all these lines (OECD, 1997).

Bibliographic data are one of several ways of looking into the research performance of a country (others include, for example, the number of collaborative projects, citation reports, the number of prizes and medals received, patents applied for, etc.). Even though the use of bibliometric methods for measuring research performance for the purpose of institutional and country comparison is widely discussed and disputed, both by scholars studying research and science and by those directly engaged in day-to-day research work (e.g. van Raan, 2005), it remains the most common approach to measuring research performance.
One of the sources of data indicative of research performance that came to our attention in the process was the Web of Science (WoS), part of the Institute for Scientific Information (ISI) Web of Knowledge.

Web of Science is Thomson Reuters’ online academic citation index, designed to search a number of databases containing information from thousands of academic journals, books, book series, reports, conference proceedings and other sources. There are seven databases in total, three of which are most commonly referred to and analysed, namely SCI-Expanded, SSCI and A&HCI. In addition, two databases cover conference proceedings, that is, the literature published within the scope of a recognised conference, symposium, etc.: CPCI-S and CPCI-SSH. Finally, the Web of Science includes two chemistry databases, designed to search for chemical compounds and reactions: Index Chemicus (IC) and Current Chemical Reactions (CCR-Expanded).

For the purpose of this chapter we conducted a small-scale study of publications by authors who, when submitting their publications, reported being geographically located in Croatia, Serbia or Slovenia, starting from the year 2000. Yet as Serbia was, until 2006, part of first FR Yugoslavia and then of Serbia and Montenegro (Montenegro’s population of about 620,000 being roughly ten times smaller than Serbia’s), we also included calculations for Yugoslavia and for Serbia and Montenegro up to 2006. We ran queries covering SCI-Expanded, SSCI and A&HCI, as well as CPCI-S and CPCI-SSH, for the period between 2000 and the present (mid-July 2011).
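As an illustration only, and not a record of our exact procedure, the minimal sketch below shows the kind of country-level queries involved. It assumes the standard Web of Science advanced-search field tags CU (country/region) and PY (publication year); the edition codes correspond to the five databases named above.

```python
# Illustrative sketch of the kind of advanced-search queries behind the country
# counts in this chapter (not the authors' exact procedure).
# CU (country/region) and PY (publication year) are standard WoS field tags.

EDITIONS = ["SCI-EXPANDED", "SSCI", "A&HCI", "CPCI-S", "CPCI-SSH"]

COUNTRIES = ["Croatia", "Serbia", "Slovenia", "Montenegro",
             "Serbia Montenegro", "Yugoslavia"]

def country_query(country: str, first_year: int = 2000, last_year: int = 2011) -> str:
    """Build an advanced-search string for one country and a publication-year range."""
    return f"CU={country} AND PY={first_year}-{last_year}"

if __name__ == "__main__":
    for country in COUNTRIES:
        # One query per country, run against the five editions listed above.
        print(country_query(country), "| editions:", ", ".join(EDITIONS))
```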

With regard to the reliability of the data collected from the Web of Science, it must be stated that errors are not rare and can be caused by a number of factors. Concerning citation counts, Glänzel et al. (2003) distinguish between four main causes, namely the database producer, the publication author, the journal editor and the user of the bibliographic database. As we witnessed during our research, these are also relevant when acquiring data from the database for purposes other than citation counts. For instance, when it comes to the names of the institutions behind a publication or the funding source, it often happens that one institution appears several times under different names. The Slovenian Research Agency can appear as ARRS, SRA, Slovenian Agency or in some other form; the University of Zagreb can also appear as Univ Zagreb, Sveuciliste Zagreb, and so on. Or, an organisational unit legally part of a university can sometimes be listed under its own name, and in order to ascribe its publications to a particular university one needs to be aware of the relationship. It is therefore important to state that we took as much precaution as possible to correct these kinds of errors and that no severe miscalculations have taken place. As a further precaution, the percentages given here represent the minimum for a certain category of data, i.e. the percentage could be higher but it is highly unlikely to be lower.
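A simple way to handle such name variants is to map each known variant to a canonical institution name before counting. The sketch below is a minimal illustration of this step; the variant lists are examples drawn from the cases mentioned above, not an exhaustive record of what appears in the data.

```python
# Minimal sketch of the name normalisation described above.
# The variant lists are illustrative examples only.

VARIANTS = {
    "University of Zagreb": ["univ zagreb", "sveuciliste zagreb", "university of zagreb"],
    "Slovenian Research Agency": ["arrs", "sra", "slovenian agency",
                                  "slovenian research agency"],
}

# Invert the mapping once: raw variant (lower-cased) -> canonical name.
CANONICAL = {v: name for name, variants in VARIANTS.items() for v in variants}

def normalise(raw_affiliation: str) -> str:
    """Map a raw affiliation string to its canonical institution name, if known."""
    return CANONICAL.get(raw_affiliation.strip().lower(), raw_affiliation)

print(normalise("Univ Zagreb"))        # -> University of Zagreb
print(normalise("ARRS"))               # -> Slovenian Research Agency
print(normalise("Unknown Institute"))  # left unchanged
```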

Table 1 and its accompanying Figure 1 below show an overall increase in the number of publications in the respective databases over the course of 11 years (with the exception of 2011, which was not over at the time of writing). The obvious exceptions to this are Yugoslavia and Serbia and Montenegro, clearly so from 2004 to 2006, a period during which Serbia’s contribution skyrocketed compared with 2000-2004. Our assumption is that at some point a considerable number of researchers started submitting their works as coming from Serbia instead of Yugoslavia, which can also be inferred from the clear course taken by Serbia once Montenegro became independent in 2006. The overall number of Montenegrin publications is therefore also added to the overall picture.

                        2000  2001  2002  2003  2004  2005  2006  2007  2008  2009  2010
Croatia                 1776  1759  2070  2171  2411  2904  2851  3497  4360  4483  4155
Serbia                    35    49    50    66   245  1981  2599  3545  4099  4747  4719
Slovenia                2030  1996  2223  2427  2431  2806  2815  3576  4074  4053  3923
Serbia and Montenegro     32    45    49    60   131  1851   927     –     4     2     –
Montenegro                 –     –     –     –     4     2    72   100   131   155   172
Yugoslavia              1442  1330  1612  1662  1916   582     7     –     –     –     –

Table 1 Web of Science published works in SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH databases by authors from Croatia, Serbia, Slovenia, Montenegro and Yugoslavia, 2000 – 2010. Retrieved from WoS in July 2011.
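To see the Serbian trajectory as one continuous series despite the change of country labels, the three affected rows of Table 1 can be summed per year. The sketch below does that rough aggregation; it assumes the three rows do not overlap (no record attributed to more than one label) and reads blank cells as zero.

```python
# Rough per-year aggregation of the Serbian series across its changing country
# labels (Yugoslavia -> Serbia and Montenegro -> Serbia), using Table 1 figures.
# Blank cells are treated as zero; the sum assumes the rows are disjoint.

years = list(range(2000, 2011))

serbia                = [35, 49, 50, 66, 245, 1981, 2599, 3545, 4099, 4747, 4719]
serbia_and_montenegro = [32, 45, 49, 60, 131, 1851, 927, 0, 4, 2, 0]
yugoslavia            = [1442, 1330, 1612, 1662, 1916, 582, 7, 0, 0, 0, 0]

combined = [s + sm + yu for s, sm, yu in zip(serbia, serbia_and_montenegro, yugoslavia)]

for year, total in zip(years, combined):
    print(year, total)
```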


For the sake of clarity, and given the relatively modest contribution of Montenegro in the post-2000 period (Table 1), Figure 1 includes data neither for Montenegro nor for Serbia and Montenegro.

Figure 1 Web of Science published works in SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH databases by authors from Croatia, Serbia, Slovenia and Yugoslavia, 2000 – 2010.


Three points are worth mentioning here. First, the apparent increase in research publications should not be taken as a direct indicator of an increase in the bibliographic productivity of scientists, owing to changes in WoS journal coverage over the period, as illustrated in the paragraphs that follow.

Figure 2 Journal coverage on the Web of Science in 2005 and 2010 for Croatia, Serbia and Slovenia

Second, in an essay titled ‘The Globalization of the Web of Science’ published by Thomson Reuters in June 2011, Croatia is listed among the 14 countries which added 40 or more journals to the Web of Science between 2005 and 2010 (Testa, 2011). By the end of 2010, 61 Croatian journals were listed on the WoS. Similarly, back in 2005 Slovenia had only 6 journals indexed, adding 20 more by 2010. In the same fashion, Serbia had no journal coverage in 2005 and by 2010 it had 23. However, as stated in the same source, ‘four journals now published in Serbia, for example, were covered in Web of Science before 2005 but originated from different countries (3 were formerly from Yugoslavia and one was published in Germany)’ (Testa, 2011, p. 4). As we note below, these journals are the ones in which a considerable percentage of the publications on the Web of Science attributed to the country in question are located. In this sense, what we see in Figure 2 can be taken as an indicator of the international visibility of national scientific activity, which is, as shown, on the rise, rather than as an indicator of increased productivity.

Third, with respect to the journals in which articles are published, domestic journals are among the most popular outlets. In 2010, out of the top 25 journals by number of publications from Croatian authors, at least 18 were registered in Croatia, accounting for about 22% of all Croatian publications in Web of Science-indexed journals. In the same period, in the case of Slovenia this ratio was 14 out of 25, or about 12% of the total Slovenian publication count on the WoS. Serbia, on the other hand, had only 7 of its top 25 most popular journals registered on its own territory, but these still covered about 13%, similar to Slovenia. Despite this apparently high concentration of publications in domestic journals, the majority of publications still go to foreign journals, and to a diverse group of them. This comes as no surprise given the increase in journal coverage for the three countries between 2005 and 2010. In other words, Thomson Reuters’ databases are tied to the journals which are indexed on the Web of Science. Therefore, a logical step is to look into the changes which occurred on the WoS with regard to the journals indexed.
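The percentages above follow from a simple calculation: sum the publications appearing in domestically registered journals and divide by the country’s total on the WoS. The sketch below reproduces that arithmetic with hypothetical journal names and counts; only the method mirrors the figures reported in the text.

```python
# Illustrative calculation of the "domestic share" reported above.
# Journal names and counts are hypothetical placeholders.

def domestic_share(per_journal_counts: dict[str, int],
                   domestic_journals: set[str],
                   national_total: int) -> float:
    """Fraction of a country's WoS publications appearing in domestic journals."""
    in_domestic = sum(count for journal, count in per_journal_counts.items()
                      if journal in domestic_journals)
    return in_domestic / national_total

counts = {"Journal A": 300, "Journal B": 250, "Journal C": 120}    # hypothetical
domestic = {"Journal A", "Journal B"}                              # hypothetical
print(f"{domestic_share(counts, domestic, national_total=2500):.0%}")  # -> 22%
```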

In sum, in all three countries there appears to be an upward trend in publishing, both when viewed through the lens of national statistics offices and through the Web of Science. This trend was steeper in the case of Serbia over the past decade, most likely due to the country’s later political and economic stabilisation. Currently, all three countries seem to be at more or less the same level in terms of publication numbers, as far as this source is concerned. However, it has been demonstrated that the number of publications in the Web of Science databases is indicative not of productivity, but rather of the international visibility of national scientific activity in terms of publications, as what has certainly increased is the number of domestic journals, which are among the most popular publishing destinations for domestic authors in all three countries. With the aim of determining whether productivity has indeed increased, further research could, for instance, look into whether the number of publications by these countries’ authors in foreign journals has increased in the meantime.

Author: Jelena Branković

Centre for Education Policy

References

Dill, D. D. (1998). Evaluating the “Evaluative State”: Implications for Research in Higher Education. European Journal of Education, 33(3), 361-377.

Dill, D. D., & Soo, M. (2005). Academic quality, league tables, and public policy: A cross-national analysis of university ranking systems. Higher Education, 49(4), 495-533.

Glänzel, W., Schlemmer, B., & Thijs, B. (2003). Better late than never? On the chance to become highly cited only beyond the standard bibliometric time horizon. Scientometrics, 58(3), 571–586.

Groof, J. de, Neave, G. R., & Švec, J. (1998). Democracy and governance in higher education. Martinus Nijhoff Publishers.

Neave, G. (1998). The Evaluative State Reconsidered. European Journal of Education, 33(3), 265-284.

OECD. (1997). The Evaluation of Scientific Research: Selected Experiences. Paris: OECD.

Shin, J. C., Toutkoushian, R. K., & Teichler, U. (Eds.). (2011). University Rankings. Dordrecht: Springer Netherlands.

Testa, J. (2011). The Globalization of Web of Science: 2005-2010. Thomson Reuters. Retrieved from http://wokinfo.com/media/pdf/globalwos-essay.pdf

van Raan, A. F. J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133-143.

 

NOTE:

The text is a part of the forthcoming CEP publication: Brankovic, J. & Šabić, N. (eds.) Research Policy, Financing and Performance. Croatia, Serbia and Slovenia in comparative perspective. Belgrade: Centre for Education Policy
