Classifying and visualizing the disciplinary focus of universities: The invisible factor of university rankings
Nicolas Robinson-Garcia. PhD thesis. Granada: Universidad de Granada, 2014.
Here we analyze the phenomenon of university rankings and how they affect the assessment of national higher education systems. The thesis poses the following research questions: 1) Are national rankings necessary in an international context? and 2) How can we provide research policy makers with tools that allow them to analyze different university profiles according to their disciplinary specialization? To answer these questions we present five publications. While the first deals with the role of national university rankings, the other four deal with the problems derived from using rankings to analyze different university profiles; they also offer visualization techniques and science mapping solutions to overcome such problems.
An insight into the importance of national university rankings in an international context: The case of the I-UGR Rankings of Spanish universities
Robinson-Garcia, N., Torres-Salinas, D., Delgado López-Cózar, E., Herrera, F. Scientometrics, 101(2), 1309-1324. doi:10.1007/s11192-014-1263-1
The great importance international rankings have achieved in the research policy arena draws attention to the many threats that derive from the flaws and shortcomings of these tools. One of them is their inability to accurately represent national university systems, as their original purpose is only to rank world-class universities. Another is the lack of representativeness of universities’ disciplinary profiles, as rankings usually provide a single overall table. Although some rankings offer great coverage and others offer league tables by fields, no international ranking does both. In order to overcome this limitation from a research policy viewpoint, this paper analyzes the possibility of using national rankings to complement international ones. For this, we take the Spanish university system as a case study, presenting the I-UGR Rankings of Spanish universities by fields and subfields. We then compare their results with those obtained by the Shanghai Ranking, the QS Ranking, the Leiden Ranking and the NTU Ranking, as they all share basic common grounds which allow such a comparison. We conclude that it is advisable to use national rankings to complement international rankings, although we observe that this must be done with caution, as the rankings differ in the methodology employed as well as in the construction of the fields.
What do rankings by fields rank? Exploring discrepancies between the organizational structure of universities and bibliometric classifications.
Robinson-Garcia, N.; Calero-Medina, C. Scientometrics, 98(3), 1955-1970. doi:10.1007/s11192-013-1157-7.
University rankings by fields are usually based on the research output of universities. However, research managers and ranking consumers expect such fields to reflect the organizational structure of their own institution. In this study we address this misinterpretation by developing the research profiles of the organizational units of two Spanish universities: the University of Granada and Pompeu Fabra University. We use two classification systems: the subject categories offered by Thomson Scientific, which are commonly used in bibliometric studies, and the 37 disciplines displayed by the Spanish I-UGR Rankings, which are constructed by aggregating the former. We also describe in detail the problems encountered when working with address data from a top-down approach, and we show differences between university structures derived from the interdisciplinary organizational forms of new managerialism at universities. We conclude by highlighting that rankings by fields should clearly state the methodology used for the construction of such fields. We indicate that building research profiles may be a good way for universities to find out the level of discrepancy between their organizational units and subject fields.
Network analysis of Spanish universities according to their journal publication profile by scientific area.
Robinson-Garcia, N.; Rodriguez-Sánchez, R.; García, J.A.; Torres-Salinas, D.; Fdez-Valdivia, J. Revista Española de Documentación Científica, 36(4), e027. doi:10.3989/redc.2013.4.1042.
This study presents a descriptive analysis of Spanish universities according to their journal publication profile in five scientific domains over the 2007-2011 period. Two universities have a similar journal publication profile if they publish in a high number of common journals. Following this idea, it is possible to map universities, offering an enriching view of the Spanish higher education system. To do so, we analyze the areas of Social Sciences, Exact Sciences, Engineering & Technology, Life Sciences and Health Sciences. We also use the degree centrality indicator from social network analysis to identify the universities with the greatest role in each area, that is, those with the highest number of direct connections to the rest of the universities. Finally, we discuss the application of this methodology in a science policy context, for the search for potential scientific partners.
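The approach described above can be sketched in a few lines. This is an illustrative toy example, not the paper's actual data or code: the university names, journal sets, and edge threshold are all hypothetical. Two universities are linked when they share enough common journals, and degree centrality (direct connections normalized by n − 1) identifies the most central institutions.

```python
# Hypothetical journal sets per university (toy data for illustration).
journals = {
    "UGR":  {"J1", "J2", "J3", "J4"},
    "UPF":  {"J2", "J3", "J5"},
    "UNAV": {"J4", "J6"},
    "UCM":  {"J1", "J2", "J6"},
}

THRESHOLD = 2  # minimum number of shared journals to draw an edge

unis = sorted(journals)
edges = {
    (a, b)
    for i, a in enumerate(unis)
    for b in unis[i + 1:]
    if len(journals[a] & journals[b]) >= THRESHOLD
}

def degree_centrality(node):
    """Fraction of the other universities this one is directly linked to."""
    degree = sum(node in edge for edge in edges)
    return degree / (len(unis) - 1)

centrality = {u: degree_centrality(u) for u in unis}
```

With these toy profiles, UGR shares two journals with both UCM and UPF, so it ends up the most central node of the map.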
The BiPublishers ranking: Main results and methodological problems when constructing rankings of academic publishers.
Torres-Salinas, D., Robinson-Garcia, N., Jiménez-Contreras, E., de la Fuente, E. Revista Española de Documentación Científica, 38(4), e111. doi:10.3989/redc.2015.4.1287b
We present the results of the Bibliometric Indicators for Publishers project (also known as BiPublishers). This project represents the first attempt to systematically develop bibliometric publisher rankings. The data for this project were derived from the Book Citation Index, and the study time period was 2009-2013. We have developed 42 rankings: 4 by fields and 38 by disciplines. We display six indicators per publisher, divided into three types: output, impact and publisher’s profile. The aim is to capture different facets of the research performance of publishers. A total of 254 publishers were processed and classified by publisher type: commercial publishers and university presses. We present the main publishers by field and then discuss the main challenges encountered when developing this type of tool. The BiPublishers ranking is an ongoing project which aims to develop and explore new data sources and indicators to better capture and define the research impact of publishers.
Analyzing the citation characteristics of books: edited books, book series and publisher types in the Book Citation Index.
Torres-Salinas, D.; Robinson-Garcia, N.; Cabezas-Clavijo, Á.; Jiménez-Contreras, E. Scientometrics,98(3), 2113-2127. doi:10.1007/s11192-013-1168-4.
This paper presents a first approach to analyzing the factors that determine the citation characteristics of books. For this we use Thomson Reuters’ Book Citation Index, a novel multidisciplinary database launched in 2011 which offers bibliometric data on books. We analyze three factors considered to affect the citation impact of books: the presence of editors, inclusion in a series, and the type of publisher. We also focus on highly cited books to see if these factors affect them as well, considering as highly cited those books in the top 5% most cited in the database. We define these three aspects and present results for four major scientific areas in order to identify differences by area (Science, Engineering & Technology, Social Sciences, and Arts & Humanities). Finally, we report differences for edited books and publisher type; books included in series, however, showed higher impact in only two areas.
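The top-5% cutoff used to flag highly cited books can be made concrete with a small sketch. The citation counts below are hypothetical, and the simple rank-based threshold is one plausible reading of "top 5% most cited", not necessarily the exact procedure used in the paper.

```python
# Hypothetical citation counts for 20 books (toy data for illustration).
citations = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 6, 7, 8, 9, 10, 12, 15, 20, 40]

def top_share_threshold(counts, share=0.05):
    """Smallest citation count that still falls inside the top `share` slice."""
    ranked = sorted(counts, reverse=True)
    k = max(1, int(len(ranked) * share))  # size of the top slice (at least 1)
    return ranked[k - 1]

threshold = top_share_threshold(citations)
highly_cited = [c for c in citations if c >= threshold]
```

With 20 books, the top 5% is the single most cited book, so only the book with 40 citations clears the cutoff.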
Coverage, specialization and impact of scientific publishers in the ‘Book Citation Index’.
Torres-Salinas, D.; Robinson-Garcia, N.; Campanario, J.M.; Delgado López-Cózar, E. Online Information Review, 38(1), 24-42. doi:10.1108/OIR-10-2012-0169.
Purpose: The aim of this study is to analyze the disciplinary coverage of Thomson Reuters’ Book Citation Index database, focusing on publisher presence, impact and specialization. Design/Methodology/Approach: We conduct a descriptive study in which we examine coverage by discipline, publisher distribution by field and country of publication, and publisher impact. For this, the Thomson Reuters Subject Categories were aggregated into 15 disciplines. Findings: 30% of the total share of this database belongs to the fields of the Humanities and Social Sciences. Most disciplines are covered by very few publishers, mainly from the UK and USA (75.05% of the books); in fact, 33 publishers account for 90% of the whole share. Regarding publisher impact, 80.5% of the books and chapters remained uncited. Two serious errors were found in this database. Firstly, the Book Citation Index does not retrieve all citations for books and chapters. Secondly, book citations do not include citations to their chapters. Research limitations/implications: The Book Citation Index is still underdeveloped and has serious limitations which call for caution when using it for bibliometric purposes. Practical implications: The results obtained from this study warn against the use of this database for bibliometric purposes, but open a new window of opportunity for covering long-neglected areas such as the Humanities and Social Sciences. The target audience of this study is librarians, bibliometricians, researchers, scientific publishers, prospective authors and evaluation agencies. Originality/Value: There are currently no studies analyzing in depth the coverage of this novel database, which covers monographs.
Mapping citation patterns of book chapters in the Book Citation Index.
Torres-Salinas, D.; Rodriguez-Sánchez, R.; Robinson-Garcia, N.; Fdez-Valdivia, J.; García, J.A. (2013). Journal of Informetrics, 7(2): 412-424. doi:10.1016/j.joi.2013.01.004.
In this paper we provide the reader with a visual representation of the relationships among the impact of book chapters indexed in the Book Citation Index, using information gain values, for different academic publishers in specific disciplines. The impact of book chapters can be characterized statistically by citation histograms: for instance, we can compute, for each academic publisher, the probability of occurrence of book chapters whose citation counts fall in different intervals. We estimate the similarity between two citation histograms based on the amount of relative information between such characterizations. We observe that the citation patterns of book chapters follow a Lotkaian distribution. This paper describes the structure of the Book Citation Index using ‘heliocentric clockwise maps’, which allow the reader not only to determine the degree of similarity between a given academic publisher indexed in the Book Citation Index and a specific discipline according to their citation distributions, but also to easily observe the general structure of a discipline, identifying the publishers with the highest impact and output.
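The histogram-comparison idea can be sketched as follows. The bins and probabilities are hypothetical, and the symmetrized Kullback-Leibler divergence shown here is one standard way to measure relative information between two distributions; it is a sketch of the general technique, not a reproduction of the paper's exact formula.

```python
import math

# Citation-count bins (hypothetical intervals).
INTERVALS = ["0", "1-2", "3-5", "6+"]

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def symmetric_divergence(p, q):
    """Symmetrized divergence: zero iff the histograms are identical."""
    return kl(p, q) + kl(q, p)

# Hypothetical citation histograms for two publishers in one discipline:
# probability that a chapter's citation count falls in each interval.
publisher_a = [0.50, 0.25, 0.15, 0.10]
publisher_b = [0.40, 0.30, 0.20, 0.10]

d = symmetric_divergence(publisher_a, publisher_b)
```

A small divergence means the two publishers' chapters have similar citation distributions, which is the notion of similarity the maps are built on.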
Analyzing data citation practices using the Data Citation Index.
Robinson-Garcia, N., Jiménez-Contreras, E., Torres-Salinas, D. Journal of the Association for Information Science and Technology. Early view. doi:10.1002/asi.23529
We present an analysis of data citation practices based on the Data Citation Index from Thomson Reuters. This database, launched in 2012, aims to link data sets and data studies with the citations they receive from the other citation indexes. The DCI harvests citations to research data from papers indexed in the Web of Science. It relies on the information provided by the data repositories, as data citation practices are inconsistent or nonexistent in many cases. The findings of this study show that data citation practices are far from common in most research fields. Some differences were found in the way researchers cite data: while in the areas of Science and Engineering & Technology data sets were the most cited, in the Social Sciences and Arts & Humanities data studies play a greater role. A total of 88.1 percent of the records have received no citations, but some repositories show very low uncitedness rates. Although data citation practices are rare in most fields, they have expanded in disciplines such as crystallography and genomics. We conclude by emphasizing the role that the DCI could play in encouraging the consistent, standardized citation of research data; a role that would enhance their value as a means of following the research process from data collection to publication.
New data, new possibilities: Exploring the insides of Altmetric.com
Robinson-Garcia, N.; Torres-Salinas, D.; Zahedi, Z.; Costas, R. El profesional de la información, 23(4), 359-366. doi:10.3145/epi.2014.jul.03.
This paper analyzes Altmetric.com, one of the most important altmetric data providers currently in use. We analyzed a set of publications with a DOI indexed in the Web of Science during the period 2011-2013 and collected their data with the Altmetric API. Of the original set of papers, 19% were retrieved from Altmetric.com with at least some altmetric data. We identified 16 different social media sources from which Altmetric.com retrieves data; however, five of them cover 95.5% of the total set. Twitter (87.1%) and Mendeley (64.8%) have the highest coverage. We conclude that Altmetric.com is a transparent, rich and accurate tool for altmetric data. Nevertheless, there are still potential limitations regarding its exhaustiveness, as well as its selection of social media sources, that need further research.
Trends in science maps: co-use of scientific information as a reflection of researchers’ interests.
Torres-Salinas, D.; Jiménez-Contreras, E.; Robinson-Garcia, N. El profesional de la información, 23(3), 253-258. doi:10.3145/epi.2014.may.05.
This paper explores the possibility of constructing science maps based on the co-use that the staff of an academic institution make of the scientific literature. For this, we define co-use as the co-occurrence of scientific information requests by a given user on a platform of scientific journals. We use request data from the University of Navarre to the ScienceDirect platform in 2012 to analyze the potential of this methodological approach. We conclude by emphasizing the viability of this methodology for exploring the research interests of an academic institution, along with the relations between different disciplines.
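The co-use definition above reduces to counting co-occurrences of journals in each user's requests. The sketch below uses invented user logs and journal names purely to illustrate how such a co-occurrence matrix can be built; it is not the authors' actual pipeline.

```python
from collections import Counter
from itertools import combinations

# Hypothetical log: each user mapped to the journals they requested
# articles from on the platform (toy data for illustration).
requests = {
    "user1": {"Journal A", "Journal B", "Journal C"},
    "user2": {"Journal A", "Journal B"},
    "user3": {"Journal B", "Journal C"},
}

# Two journals co-occur when the same user requested both; the counts
# form the weighted edges of the co-use map.
cooccurrence = Counter()
for titles in requests.values():
    for pair in combinations(sorted(titles), 2):
        cooccurrence[pair] += 1
```

Journals requested together by many users end up strongly linked, so the resulting map reflects the institution's research interests rather than formal subject classifications.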
The Google Scholar Experiment: how to index false papers and manipulate bibliometric indicators
Delgado López-Cózar, E.; Robinson-Garcia, N.; Torres-Salinas, D. (2014). Journal of the Association for Information Science and Technology, 65(3), 446-454. doi:10.1002/asi.23056.
Google Scholar has been well received by the research community. Its promise of free, universal and easy access to scientific literature, as well as the perception that it covers the Social Sciences and the Humanities better than other traditional multidisciplinary databases, have contributed to the quick expansion of Google Scholar Citations and Google Scholar Metrics: two new bibliometric products that offer citation data at the individual and journal levels. In this paper we show the results of an experiment undertaken to analyze Google Scholar’s capacity to detect the manipulation of citation counts. For this, six documents authored by a false researcher and referencing all the publications of the members of the EC3 research group at the University of Granada were uploaded to an institutional web domain. Once Google Scholar indexed these documents, the citation counts in the Google Scholar Citations profiles of the targeted authors surged. We discuss the effects of this surge and how it could affect the future development of these products not only at the individual level but also at the journal level, especially if Google Scholar persists in its lack of transparency.