The growing use of Google Scholar (GS) as a source of information and of scientific evaluation is an indisputable fact. Since Google's products are widely discussed and sometimes spark acrimonious debate, it is essential to provide empirical, research-based evidence about their uses and applications, accurately determining their strengths and weaknesses as data sources.
To address this need, the EC3 Research Group makes available to the scientific community the GOOGLE SCHOLAR DIGEST. Research on Google Scholar: empirical evidences. This newsletter aims to report the results of new research that sheds light on Google Scholar and its derivative products (Google Scholar, Google Scholar Citations, and Google Scholar Metrics).
Only studies with an empirical perspective will be included in this blog, i.e. research producing quantitative results that can be contrasted, verified, and compared in order to check their reliability and validity.
Therefore, publications containing only opinions, thoughts, or discussions are excluded, however interesting they may be, since they do not provide empirical evidence demonstrating the usefulness or uselessness, the advantages or disadvantages, of Google Scholar as a tool for information retrieval and scientific evaluation.
This newsletter contains the following three sections:
- Bibliography: all empirical research published so far, with the corresponding abstracts.
- News: reviews of the main findings and conclusions of empirical studies as they are published.
- Reports: analyses of selected issues or aspects relevant to research on Google Scholar.
To inaugurate the blog, we present a first report entitled Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search dead? This working paper summarizes the main empirical evidence provided by the scientific community on the comparison between the two main citation-based academic search engines, Google Scholar (GS) and Microsoft Academic Search (MAS), paying special attention to the following issues: coverage, correlations between journal rankings, and usage of these academic search engines.
Additionally, we offer data of our own, intended to provide current evidence of these tools' popularity on the Web: the number of rich files (PDF, PPT, and DOC) in which each tool is mentioned, the number of external links each product receives, and search-query frequencies from Google Trends.
The poor results obtained by MAS led us to an unexpected and previously unnoticed discovery: Microsoft Academic Search has not been updated since 2013. The second part of the working paper therefore presents data demonstrating this lack of updating. For this purpose we gathered the total number of records indexed by MAS for each year since 2000. The data show an abrupt drop in the number of documents indexed: from 2,346,228 in 2010 to 8,147 in 2013 and just 802 in 2014.
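The scale of that drop can be made concrete with a quick back-of-the-envelope calculation, a minimal sketch using only the three yearly counts quoted above (the percentage figures in the comments are derived here, not taken from the report):

```python
# Yearly totals of records indexed by MAS, as quoted in the working paper.
records = {2010: 2_346_228, 2013: 8_147, 2014: 802}

def decline_pct(baseline: int, later: int) -> float:
    """Percentage drop from a baseline count to a later count."""
    return round((1 - later / baseline) * 100, 2)

# Relative to the 2010 peak, indexing had fallen by ~99.65% in 2013
# and ~99.97% in 2014.
print(decline_pct(records[2010], records[2013]))  # 99.65
print(decline_pct(records[2010], records[2014]))  # 99.97
```

In other words, by 2014 MAS was indexing less than 0.04% of the volume it had indexed in 2010, which is what motivates the "Is Microsoft Academic Search dead?" question in the report's title.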