27 sept 2017

Metric study of information literacy in Ibero-America: from bibliometrics to altmetrics

Uribe-Tirado, A.; Alhuay-Quispe, J. (2017). 
Estudio métrico de ALFIN en Iberoamérica: de la bibliometría a las altmetrics. 
Revista Española de Documentación Científica, 40(3): e180.

This study identifies the presence, productivity, and influence of Ibero-American authors who write about information literacy (InfoLit). Using bibliometric and altmetric indicators, it analyzes the impact and subsequent use of their scholarly works on social and scientific platforms. The fifty-five most productive authors were identified from the results of bibliometric studies on InfoLit carried out at both the international and the Ibero-American scale, drawing on searches of major databases as well as publications collected in a Latin American wiki. 

Subsequently, an analysis of bibliometric and altmetric indicators at the author and publication level was carried out, based on the results of searches on eight scientific platforms (Google Scholar, ResearchGate, Academia.edu, Mendeley, ORCID, IraLIS, E-LIS and EXIT), three social networks (Facebook, Twitter and LinkedIn), and data provided by a commercial supplier (Altmetric.com). 

Overall, we found a greater presence of authors on ResearchGate (58%), Academia.edu (51%) and Google Scholar (49%) than on Mendeley (25%) and ORCID (18%). As for social platforms, the greatest potential influence lies with Facebook, owing to its high number of followers among the top 10 authors. In addition, an analysis using Spearman's rho shows, among some sources and platforms, a low positive correlation between the number of citations in Google Scholar and readers in Mendeley (r = 0.382), and low negative correlations for mentions in blogs (r = -0.237), Google+ (r = -0.214) and Twitter (r = -0.183). In conclusion, both productivity and impact-visibility are concentrated in specific authors writing about InfoLit, and various measurement resources show that for these authors there is a positive two-way impact from bibliometric to altmetric indicators and vice versa.
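For readers unfamiliar with the statistic the authors use, Spearman's rho is simply the Pearson correlation computed on the ranks of the two variables. A minimal sketch, using hypothetical per-author counts rather than the study's data:

```python
# Spearman's rank correlation, illustrated on hypothetical counts
# (these numbers are invented for the example, not taken from the study).

def ranks(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Pearson correlation of the ranks (valid with or without ties)."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical Google Scholar citations and Mendeley reader counts
# for five authors.
gs_citations = [120, 45, 300, 15, 80]
mendeley_readers = [60, 10, 150, 30, 40]
print(round(spearman_rho(gs_citations, mendeley_readers), 3))  # 0.9
```

Because rho depends only on ranks, it is robust to the skewed count distributions typical of citation and readership data.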

Comparing the number of citations in Google Scholar and ResearchGate in a sample of Colombian researchers

Aguillo-Caño I., Uribe-Tirado A., López-López W. (2017)
Visibilidad de los investigadores colombianos según sus indicadores en Google Scholar y ResearchGate. Diferencias y similitudes con la clasificación oficial del sistema nacional de ciencia -COLCIENCIAS. 
Revista Interamericana de Bibliotecología, 40(3), 221-230. 
doi: 10.17533/udea.rib.v40n3a03

The aim of this study is to contextualize the results obtained regarding the classification of researchers working in Colombian institutions according to their public Google Scholar Citations (GSC) profiles (1,390 with an h-index equal to or higher than 5). 
To this end, the study compares these findings with data obtained from the collection of Colombian authors on the academic social network ResearchGate (RG) and with the local information provided by Colciencias, the Colombian agency that publishes researcher classifications on a platform named ScienTI. 
Findings revealed significant discrepancies between GSC and RG results across the four classification categories provided by Colciencias. This suggests that Colciencias should reconsider its assessment criteria, incorporating new sources and indicators. Considering that the two sources (GSC, RG) and their indicators (h-index, RG-Index) behave differently across disciplines, Colciencias should also be careful with disciplinary assignments, adopting international classifications and developing discipline-specific indicators. Colombian academic and research organizations should become more active in recognizing the potential and importance of Internet platforms to make their research work visible and increase its impact (Ciencia 2.0).

25 sept 2017

International Survey of Research University Faculty: Use of Bibliometric Ratings, Identifiers & Indicators

International Survey of Research University Faculty: Use of Bibliometric Ratings, Identifiers & Indicators
Report by Primary Research Group
ISBN: 978-1-57440-472-2

This study presents data from 325 faculty of major universities in the USA, Canada, the UK, Ireland and Australia about how they view bibliometric indicators such as the h-index, how trustworthy they are believed to be and how often they are checked or calculated.
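For context, the h-index the survey asks about is defined as the largest h such that the researcher has at least h papers with at least h citations each. A minimal sketch (the citation counts are invented for the example):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    cits = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cits, start=1):
        if c >= i:
            h = i  # the i most-cited papers all have >= i citations
        else:
            break
    return h

# Hypothetical citation counts for a researcher's five papers.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Tools such as Google Scholar and Scopus compute this automatically, which is part of why the survey asks how often faculty check rather than calculate it.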

Characteristics of the Sample 
Personal Awareness of the h-index 
Frequency of Checking the H-Index 
Use of Journal Impact Factor Ratings and Rankings When Deciding on Publication Venue 
Ease of Finding Journal Impact Factors 
Views on Journal Impact Factor Trustworthiness 
Use of Scholar Identifiers 
Use of Thomson Reuters Research ID [sic]
Use of Orchid ID 
Use of the International Standard Number Identifier 
Use of arXiv 
Use of Web of Science 
Use of Google Scholar 
Use of Scopus 
Use of JSTORE [sic]
Use of Journal Citation 
Use of Scimago, bepress and SciVal 
Use of SciFinder 
Use of CrossRef 

The study presents data on the use of particular tools and indicators, giving specific data for each of the following: Web of Science, Scopus, Google Scholar, ORCHID ID, Thomson-Reuters Research ID, Scimago, bepress, SciVal, JSTOR, the International Standard Number Identifier, SciFinder, arXiv and CrossRef, among others.

Data in the report are broken out by tenure status, gender, age, semester teaching load, academic field, academic title, and political views of the survey participants, as well as by the country of origin, public/private status, and world ranking of their universities.

Just a few of this 113-page report’s many findings are that:
  • Google Scholar was the most frequently used tool for bibliometrics, with 85% of respondents reporting its use. All faculty aged 30 or under reported using Google Scholar, with this percentage declining to 77% among faculty aged 60 and over.
  • SciFinder use was reported by 7% of respondents overall, and was especially common at private institutions (10%) and among faculty 30 years or younger (22%).
  • 76% of faculty in the UK/Ireland had an ORCHID ID vs. only 35% in the USA.
  • Political conservatives are more likely than those with more centrist or left-wing views to feel that bibliometric measures are trustworthy.
  • Faculty in literature and languages had the most difficulty finding journal impact factor data for use in their career planning.

19 sept 2017

Tracking Scholarly Publishing of Hospitals Using MEDLINE, Scopus, WoS and Google Scholar

Pylarinou, S., & Kapidakis, S. (2017). 
Tracking Scholarly Publishing of Hospitals Using MEDLINE, Scopus, WoS and Google Scholar. 
Journal of Hospital Librarianship, 17(3), 209-216. 

Scientific literature focuses on facilitating communication among researchers. Many studies have been conducted to compare effectiveness, coverage, and performance among databases available to researchers and/or librarians. In this study, the authors compared MEDLINE/PubMed, Scopus, Web of Science (WoS), and Google Scholar performance regarding searching for scholarly publishing of institutions such as hospitals. 

Query searches for the scholarly publications of specific hospital personnel were run, and the resulting articles were compared. MEDLINE/PubMed, Scopus and Web of Science offer the option to search by affiliation; affiliations in Google Scholar can be searched by running a query as an exact phrase. Queries were phrased in a way suitable for each source while still enabling comparison. To facilitate comparison, the search was limited to 2016; data were collected at the end of August 2016.

Natural language use when authors denote affiliation affects retrieval of scholarly publishing. Effectiveness of searching scholarly publishing of a specific institution is better served when there is concurrent use of many databases.

An affiliation-based search is better served when searchers use multiple sources in combination. In this study, a comparison of bibliographic database results gave precedence to MEDLINE/PubMed. Between the freely available resources MEDLINE/PubMed and Google Scholar, MEDLINE/PubMed also provided better results.

15 sept 2017

Is Google Scholar useful for the evaluation of Chinese journals?

Zhang, Y., Lun, H., & Yang, Z. (2017)
Is Google Scholar Useful for the Evaluation of Non-English Scientific Journals? The Case of Chinese Journals
iConference 2017 Proceedings (pp. 241–261). https://doi.org/10.9776/17025

This study aims to explore how useful Google Scholar is for the evaluation of non-English journals through the case of Chinese journals. Based on a sample of 150 Chinese journals across two disciplines (Library and Information Science, Metallurgical Engineering & Technology), it compares Google Scholar with Chongqing VIP, an important Chinese citation database, in three respects: resource coverage, journal ranking, and citation data. 

Results indicate that Google Scholar has sufficient resources and citation data for the evaluation of Chinese journals. However, the Chinese journal ranking reported by Google Scholar Metrics is not yet well developed. Even so, Google Scholar can serve as an alternative to Chinese citation databases as a source of citation data. The average citation count computed from Google Scholar data is a useful metric for the evaluation of Chinese journals, providing a comprehensive reflection of a journal's impact. Overall, Google Scholar is useful and worthy of attention when evaluating Chinese journals.