4 nov 2015

Bibliometrics & the bibliometricians in Google Scholar Citations, ResearcherID, ResearchGate, Mendeley, and Twitter

In keeping with the research line the EC3 Research Group began several years ago, aimed at unravelling the inner depths of Google Scholar and testing its capabilities as a tool for scientific evaluation, this time we have turned our efforts to finding new uses for Google Scholar Citations (GSC). Based on the information available in every public GSC profile, we have developed a procedure to collect data on the scientists working in a given field of study and to aggregate those data into metrics at various levels: authors, documents, journals, and book publishers. GSC data should therefore allow us to present a picture of the history and scientific communication patterns of a discipline. In order to explore the feasibility of this project, we selected the field of Bibliometrics, Scientometrics, Informetrics, Webometrics, and Altmetrics as our test subject.
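For illustration only, the sketch below shows one way the publication list of a public GSC profile could be harvested. It is not the procedure actually used for this product: the URL parameters and CSS selectors are assumptions based on the public HTML of GSC profile pages, and Google may block automated requests or change the markup at any time.

```python
# Minimal, hypothetical sketch of harvesting one public GSC profile.
import requests
from bs4 import BeautifulSoup

def fetch_profile_publications(user_id, page_size=100):
    """Return (title, citations) pairs scraped from one public GSC profile."""
    url = "https://scholar.google.com/citations"
    params = {"user": user_id, "cstart": 0, "pagesize": page_size}
    publications = []
    while True:
        html = requests.get(url, params=params).text
        soup = BeautifulSoup(html, "html.parser")
        rows = soup.select("tr.gsc_a_tr")               # one row per document (assumed selector)
        for row in rows:
            title_cell = row.select_one("a.gsc_a_at")   # document title (assumed selector)
            cites_cell = row.select_one("td.gsc_a_c")   # times cited (assumed selector)
            if title_cell is None:
                continue
            cites = cites_cell.get_text(strip=True) if cites_cell else ""
            publications.append((title_cell.get_text(),
                                 int(cites) if cites.isdigit() else 0))
        if len(rows) < page_size:                       # last page reached
            break
        params["cstart"] += page_size
    return publications
```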
Once we have seen the picture of the discipline that can be observed through the data available in GSC, we also want to compare it to its counterparts in other academic web services, like ResearcherID, a researcher identification system launched by Thomson Reuters and built mainly upon data from the Web of Science (which has been, and still is, the go-to source for many researchers in the field of research evaluation), and other profiling services that have arisen in the wake of the Web 2.0 movement: ResearchGate, an academic social network, and Mendeley, a social reference manager which also offers profiling features. These are the most widely known academic profiling tools worldwide. In addition, we also include links to the authors' homepages (the first tool researchers used to showcase their scientific activities on the Web) and to Twitter, the popular microblogging site, in order to learn how much presence bibliometricians have on this platform and the kind of communication activities they take part in there.

The product displays 28 different indicators for 813 authors. The data are presented "as is": no filtering or cleaning has been carried out. From the ranking of the 813 bibliometricians who have made their Google Scholar Citations profile public, and from the 1,057 most cited documents in those profiles, two additional rankings have been derived according to the number of citations received: a journal ranking and a publisher ranking.
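As an illustration of how such rankings can be derived, the sketch below aggregates the citations of the top-cited documents by journal or by publisher and sorts the result. The field names and the input structure are hypothetical; the actual product may aggregate the data differently.

```python
# Hypothetical sketch: rank journals or publishers by citations received
# by the top-cited documents.
from collections import defaultdict

def rank_by_citations(documents, key):
    """Sum the citations of the documents per journal or per publisher."""
    totals = defaultdict(int)
    for doc in documents:
        source = doc.get(key)                # e.g. key = "journal" or "publisher"
        if source:
            totals[source] += doc.get("citations", 0)
    # Rank sources in decreasing order of citations received
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

# Hypothetical usage with the 1,057 most cited documents:
# journal_ranking   = rank_by_citations(top_documents, key="journal")
# publisher_ranking = rank_by_citations(top_documents, key="publisher")
```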
In short, our aim is to present a multifaceted and comprehensive perspective of the discipline, as well as to provide an easy and intuitive way to compare these products and the reflections of scientific activity each of them portrays. We also want to draw attention to the new platforms that are offering scientific performance metrics and to look into what those metrics may mean. With this step we enter the altmetrics debate, but with a different approach: from the perspective of individuals, and not only of the documents they publish. In essence, we want to find out what these tools really measure by applying them precisely to those who measure.

We are currently carrying out an analysis of the data displayed in this product, which will shortly be presented in a working paper.

The product is accessible from: