Wednesday, 18 May 2016

Elsevier purchases SSRN: social scientists face questions over whether a centralised repository is in their interests.

The Social Science Research Network (SSRN), an online repository for preprint articles and working papers, has recently been acquired by publishing giant Elsevier. Thomas Leeper looks at what this purchase, and for-profit academic services more generally, mean for the scholarly community. Many regular users may not be aware that SSRN has been run by a privately held corporation since its founding in 1994.

What are the most-cited publications in the social sciences (according to Google Scholar)?

Drawing on citation data that spans disciplines and time periods, Elliott Green has identified the most cited publications in the social sciences. Here he shares his findings on the 25 most cited books as well as the top ten journal articles. The sheer number of citations for these top publications is worth noting, as is the fact that no single discipline dominates the top 20: the top six books all come from different disciplines.

Monday, 16 May 2016

Best universities in the UK 2016 from the THE World University Rankings

This UK university league table reveals the 78 best UK universities and colleges, according to the Times Higher Education World University Rankings 2015-2016. The University of Oxford and the University of Cambridge take the top two spots in the UK ranking, while universities in London fill out the rest of the top five.

Bias against novelty in science | VOX, CEPR’s Policy Portal

There is growing concern that funding agencies supporting scientific research are increasingly risk-averse, favouring safe projects at the expense of novel projects exploring untested approaches. This column uses the citation trajectories for over 1 million research papers to examine the impact profile of novel research. Novel papers tend to suffer from delayed impact, but are more likely to become big hits in the long run and to generate follow-up research. The short time windows of the bibliometric indicators that are increasingly used by funding agencies in their decision-making may bias funding decisions against novelty.

Monday, 25 April 2016

Algorithmic accountability in scholarship: what we can learn from #DeleteAcademiaEdu

The controversy surrounding Academia.edu highlights the flaws and limitations of existing scholarly infrastructures. Jean-Christophe Plantin explores the intersection of algorithms, academic research and platforms for scholarly publications. He argues that there is a need to develop a values-centred approach in the development of article-sharing platforms, with suitably designed algorithms.

Monday, 4 April 2016

Research gets increasingly international: big US report documents increases in international collaboration and Chinese science output.

China’s share of global science and engineering publications has pulled within a percentage point of the United States’ share, according to the latest research statistics published by the US National Science Foundation (NSF). The agency's report, released on 19 January, also underscores the rising importance of international scientific collaboration: between 2000 and 2013, the percentage of publications with authors from multiple countries rose from 13.2% to 19.2%. Interestingly, the 2016 Indicators report also changed the metrics by which it measures publications. Instead of using the Thomson Reuters Science Citation Index and the Social Science Citation Index, the NSF used Elsevier’s Scopus database. The change was made to get a more accurate view of global trends, says Carol Robbins, the NSF senior analyst who oversaw the bibliometrics portion of the report. By using Scopus, the 2016 analysis was able to look at roughly 17,000 journals, compared with the 5,087 included in the previous report two years ago.

Monday, 18 January 2016

Interested in Alternative Metrics (Altmetrics) for evaluating your Research Performance?

Come to the PlumX presentation at 1.30pm in the Library on 26 January, as part of the Faculty of Science and Engineering Publications Festival. The presenters will also be available before and after the talk to take any questions you may have. PlumX provides analytics that help you understand the impact of your research beyond citations alone. It categorises metrics into five types: Usage (clicks, downloads, views, library holdings, video plays), Captures (bookmarks, code forks, favourites, readers, watchers), Mentions (blog posts, comments, reviews, Wikipedia links), Social Media (+1s, likes, shares, tweets), and Citations (PubMed Central, Scopus, patents). For more details, see http://plumanalytics.com/learn/about-metrics/. The Library currently has a trial of the product, and the Research Support Librarian (ciaran.quinn@nuim.ie) will be happy to give a demonstration if required. The Library would also welcome any feedback you may have on the value of such a product. If you are interested in seeing how this might look at an institutional level, the University of Pittsburgh has it up and running at https://plu.mx/pitt/g/

Friday, 15 January 2016

Altmetrics - a social revolution or just a hype?

Is social impact measurement by altmetrics a valid method for governmental publications?

The traditional method of measuring the importance of scholarly publications is based on citation counts in other scholarly journals. But social impact can also be gauged from figures on how publications are shared, downloaded, bookmarked, mentioned, liked, retweeted and cited on social platforms. This data reveals more about how the information and knowledge contained in a publication are actively being used. For the academic world, this type of measurement can be a valuable addition to the well-established Journal Impact Factor (JIF).

Monday, 11 January 2016

Italy’s Research Evaluation Exercise | The Academic Executive Brief

The Italian National Agency for the Evaluation of the University and Research Systems (ANVUR) is starting a new project aimed at evaluating research outputs published by Italian professors and researchers between 2011 and 2014. Overall, more than 130,000 publications are expected to be evaluated.

The goal of the exercise is to evaluate the quality of the research conducted in Italian universities, and to rank these institutions and their departments in each of the 16 research areas that comprise all research activities in Italy. ANVUR has designated 400 assessors as the Group of Evaluation Experts (GEV) whose evaluation will significantly inform the distribution of public funds.