Responsible metrics

There are various philosophical discussions about the merits of using citation scores and metrics as measures of research quality, and it is important to note the different cultures of citation, authorship and publication rates across disciplines. There is a wide range of activity in research metrics and impact measurement to ensure that measures are used responsibly and that stakeholders experience a level playing field, most notably the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and the Metric Tide report, which are cited throughout this page.

Three key themes in responsible metrics are:

  • The need to avoid and challenge use of the Journal Impact Factor (JIF) as a proxy measure of quality for individual research articles
  • The importance of only comparing like with like. For example, do not compare researchers at different career stages or in different disciplines. The use of normalised indicators goes some way to addressing this issue.
  • Promoting the idea of a “Basket of Metrics”. Whilst a single metric could be “gamed”, it is more difficult to game a range of metrics. A basket of metrics also provides a more convincing statement of your research excellence.

More themes in responsible metrics

  • Quantitative evaluation should support qualitative expert assessment (DORA 3, 15, Leiden 1, 7, Metric Tide 1, 17a)
  • The value and impact of all types of research outputs need to be considered (DORA 3, Leiden 6)
  • A move towards a range of article-based metrics is needed (DORA 7, 17, Metric Tide 4, 8)
  • Responsible authorship practices, relating to the specific contributions of authors, are needed (DORA 8)
  • Research performance should be measured against the research missions of the institution, group or researcher (Leiden 2, 3, Metric Tide 4)
  • Transparency is needed in how metrics are generated, explained and used by decision makers (Leiden 4, 5, Metric Tide 6, 7, 9)
  • Normalised indicators are required (Leiden 6)
  • The h-index varies by discipline and is database-dependent (Leiden 7)
  • False precision should be avoided (Leiden 8, 9)
  • Responsible use of metrics is important from an equality and diversity perspective (Metric Tide 2, 3)
  • Unique identifiers such as ORCID (for researchers), ISNI (for institutions) and DOIs (for outputs) should be advocated (Metric Tide 3, 10, 11, 12, 13)
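The point about the h-index being database-dependent follows directly from how it is computed: it is the largest number h such that a researcher has h papers with at least h citations each, so the result depends entirely on the citation counts reported by whichever database you query. The minimal sketch below illustrates this with hypothetical citation counts (the figures and database names are illustrative, not real data):

```python
def h_index(citations):
    """Return the largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper still has at least as many citations as its rank
        else:
            break
    return h

# Hypothetical citation counts for the SAME researcher in two different databases:
database_a = [25, 18, 12, 7, 6, 3, 1]
database_b = [22, 15, 9, 5, 4, 2, 0]

print(h_index(database_a))  # 5
print(h_index(database_b))  # 4
```

Because the two databases index different journals and count citations differently, the same publication record yields two different h-indices, which is why comparisons based on a single database and a single metric should be treated with caution.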

Next generation metrics

The Global Research Council’s 2021 Conference Report “Responsible research assessment” acknowledged that “research assessment shapes research culture”, influencing how researchers undertake and disseminate their research. Ideally, research assessment should encourage team science, transdisciplinary science, collaboration, open research, reproducibility and equality, diversity and inclusion (EDI), and enable greater mobility between industry and academia. Next generation metrics attempt to address these challenges, and we outline some of the initiatives below.

  • Before a publication appears with author names attached, a huge amount of work goes into the associated research project. Contributions will have been made by many people whose names won’t appear on the paper as authors. Initiatives such as CRediT (Contributor Roles Taxonomy) are being explored to give recognition to people in roles other than principal investigator or lead researcher.
  • Following on from the Metric Tide (2015), Wilsdon et al. published the 2017 report Next-generation metrics: responsible metrics and evaluation for open science. The report aims to guide further work through 12 targeted recommendations along the themes of the European Open Science Agenda: developing research infrastructures, fostering and removing barriers to Open Research, and embedding Open Research in society. To learn more about Open Research practices, please consult our Open Research page and Open Research Canvas course.
  • In 2020 CESAER, the association of European Science and Technology Universities, published a white paper on Next Generation Metrics in the light of greater openness in research, with the aim of moving away from a culture of competition- and reputation-based assessment towards a culture of quality, trust and risk-taking.
  • UKRI’s Future Research Assessment Programme (FRAP) seeks to understand what a healthy, thriving research system looks like and how an assessment model can best form its foundation. Expected to conclude by late 2022, the work strands include:
    • evaluation of REF 2021
    • understanding international research assessment practice
    • investigating possible evaluation models that encourage excellent research and impact, support a positive research culture, and reduce the administrative burden on the HE sector.
  • The Hong Kong Principles were devised to reward and recognise scholars for behaviour that contributes to trustworthy research.

Further help

Library Services offers online training on responsible metrics via our Influential Researcher Canvas course.

For one-to-one appointments and bespoke workshops, contact the Research Skills Team in Library Services.
