Alternative Measures of Research Impact

Credit: madpixblue/Adobe

By Paul X. McCarthy

New “altmetrics” tools are enabling universities and the private sector to identify emerging talent much earlier than traditional measures of academic publications and citations.

Since the 1960s the main measure of much of the world’s research has been the number of citations it receives in other published work. For most academics, career success and the acquisition of grant money to continue their work depend heavily on the total number of citations they receive from others, which in turn is usually related to the volume of work they have published.

But not all publishing venues are equal. A paper published in a highly competitive journal such as Nature, Science or PNAS, or a citation received in another author’s paper in such a journal, is likely to carry more significance than one in a less competitive venue.

Furthermore, the citations approach naturally disadvantages early-career researchers. Citations tend to accumulate slowly, and even the most productive authors are unlikely to amass many in the first few years.

Statistical analysis of citations, known as bibliometrics, allows us to measure the currency, influence and impact of the work of scholars working in the same field or discipline around the world. Yet despite the growth in data and tools to better understand the impact and influence of scholarship, several key challenges remain with traditional bibliometrics:

  • How can the quality, not just quantity, of work be assessed?
  • How can scholars working in different domains be compared?
  • How can scholars at different stages in their career be compared?

In the past decade the term “altmetrics” has emerged to describe new, alternative online measures of academic research impact. The challenge is to assess a scholar’s whole body of work relative to peers in the same field, not just to count the peer-reviewed papers they have published or the citations each has accumulated.

For example, in 2005 theoretical physicist Dr Jorge Hirsch of The University of California, San Diego developed the h-index, which measures how productive and influential a researcher is according to their output (the number of papers published) as well as its quality (measured by the number of citations each paper receives). A researcher has an h-index of h if h of their papers have each been cited at least h times.
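To make that definition concrete, here is a minimal Python sketch of the calculation (the citation counts are invented purely for illustration):

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    # Count positions where the citation count still meets its 1-based rank.
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

# Invented example: five papers cited 10, 8, 5, 4 and 3 times
print(h_index([10, 8, 5, 4, 3]))  # 4
```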

However, one of the main criticisms of the h-index is that it doesn’t work across scholars in different disciplines, or even across different areas within the same discipline. Part of the problem is that each discipline has its own patterns of scholarship, citation and publishing. In the humanities and some social sciences the main unit of production is the book, whereas in some sciences it’s the refereed journal article, and in some fast-moving technology fields it is the conference paper.

Although it wasn’t intended for this purpose, h-index scores are increasingly being used to assess talent for tenure, hiring and promotion.

Assessing Early-Career Talent

In 2001, The University of Manchester made one of the most strategic hires of the past 20 years, appointing physicist and graphene pioneer Andre Geim to his first full professorship. At the time, Geim had a very solid but not spectacular track record, with fewer than 1000 career citations to his name. Just under a decade later, in 2010, Geim and his former student and long-time research partner Konstantin Novoselov were awarded the Nobel Prize in Physics for research published only three years after Geim’s appointment. The University of Manchester continues to lead graphene and many related areas of research worldwide – in June 2016 the League of Scholars ranked the University first in the world, with 46 Top 500-ranked graphene scholars.

Identifying early-career talent has become increasingly important as competition among universities for talent becomes more intense and more global. However, early-career academics are heavily disadvantaged by many current measures of output and citations. Furthermore, citation patterns vary significantly by discipline, so comparing scholars from different fields or domains is notoriously difficult.

A number of solutions are emerging to accommodate early-career scholars.

  • An individual annual h-index called hIa has been created by Prof Anne-Wil Harzing, creator of the Publish or Perish software, to accommodate disciplinary and career-length differences. This index (https://harzing.com/download/hia.pdf) reflects the annual increase in an academic’s individual h-index, so it’s capable of dealing fairly with scholars from fields with differing citation and publishing modes as well as early-career academics (a minimal calculation is sketched after this list).
  • The League of Scholars (https://www.leagueofscholars.com/) is a Google-like whole-of-web index to help universities and companies identify high-potential early-career talent for strategic hiring and promotion. Its index uses an algorithm like Google’s PageRank to give more weight to journals and publishers with higher authority, while also including industry engagement and media mentions (a toy illustration of the PageRank idea also follows this list).
  • Read counts from Mendeley (https://www.mendeley.com/), Elsevier’s reference management tool and social network for researchers, are a good predictor of future success according to Prof Mike Thelwall of The University of Wolverhampton, perhaps the world’s top authority on altmetrics. “Mendeley readers probably accumulate 1–1.5 years before citations,” he said via email. “Mendeley readers are otherwise similar to citations and the wider trend that is most promising for me is the diversity of indicators.”
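The hIa paper linked in the first item defines the index as a normalised h-index divided by career length: each paper’s citations are shared among its co-authors, an h-index is taken over those normalised counts, and the result is divided by the years elapsed since the academic’s first publication. A minimal Python sketch under that reading (the paper data are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h items each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def hia(papers, career_years):
    """Sketch of Harzing's hIa (individual annual h-index).

    `papers` is a list of (citations, n_authors) pairs. Each paper's
    citations are divided equally among its co-authors before the
    h-index is taken, then the result is averaged over career length.
    """
    normalised = [cites / authors for cites, authors in papers]
    return h_index(normalised) / career_years

# Invented example: four co-authored papers, five years into a career
papers = [(40, 4), (18, 2), (9, 3), (6, 1)]
print(hia(papers, career_years=5))  # 0.6
```

League of Scholars’ actual ranking is proprietary, but the PageRank idea it borrows is straightforward: a node’s authority is fed by the authority of the nodes that cite it. Below is a toy power-iteration sketch over an invented three-journal citation graph; the names and numbers are purely illustrative, not League of Scholars’ algorithm.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    `links` maps each node to the nodes it cites; authority flows
    along citation links, so heavily cited nodes rank higher.
    """
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            share = damping * rank[source] / len(targets)
            for target in targets:
                new[target] += share
        rank = new
    return rank

# Invented citation graph between three journals
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(links))  # C ranks highest: it is cited by both A and B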
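```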

In parallel with this, patents are becoming “the new papers” in China, according to Simon De Wulf, CEO and founder of the global online patent analytics platform PatentInspiration (http://www.patentinspiration.com). De Wulf says that many PhD students now focus on filing patents instead of academic papers, as they are considered by many to have a higher return on investment.

Other Altmetrics

In addition to better approaches to bibliometrics, many are now looking at measuring other impacts outside of traditional scholarship. A number of new online services have been created to help authors, publishers and research institutions better track and understand impact beyond traditional publishing measures.

  • Publons (https://publons.com), founded in 2012 and acquired by analytics firm Clarivate in June 2017, allows scientists to track and showcase their peer-reviewing activities. This creates a unit of currency for the previously unrewarded job of reading and critiquing the work of other academics.
  • Impactstory (https://impactstory.org) is an open-source online tool that provides altmetrics to help researchers measure and share the impacts of all their research outputs. It provides one central place for authors to collect and display social media mentions of their work across Twitter, Wikipedia, Facebook and news articles. This can help researchers as well as readers understand what is trending and resonating in their field.
  • Plum Analytics (http://plumanalytics.com/), founded in early 2012 in Philadelphia and acquired by EBSCO in 2014 and then in February 2017 by publisher Elsevier, helps universities and other research organisations collate citations, usage, mentions, captures and social media metrics. It can also be connected to academic online publishing repositories.

The interest in altmetrics is only likely to grow in the coming decade as investigators, universities and research investors continue to strive for new perspectives on the impact of research.


Paul X. McCarthy is an Adjunct Professor of Computer Science at UNSW Australia and co-founder of League of Scholars. His book Online Gravity (Simon & Schuster) explains how the web has changed the fundamental laws of economics and business.