Decision makers in the research landscape have traditionally used publication metrics to gain insights into the performance of individual researchers. However, with the drive to improve research culture, the higher education (HE) sector is moving towards a narrative CV approach, in which a story is told around discrete publications and their normalised citation performance.
This page highlights indicators that may be used as evidence in promotion or funding applications, or to identify impactful papers. For best results, we recommend curating your online publication profile before generating research metrics.
Research metrics for your portfolio include citation metrics, altmetrics and collaboration indicators.
Citation Metrics for your portfolio
Citation metrics ("bibliometrics") analyse the number of times other researchers refer to (cite) a given publication within scholarly publishing. Cultures of citation, authorship and publication rates differ widely across disciplines, and some disciplines are disadvantaged because their key publications fall outside the scope of citation databases (e.g. law journals indexed only on specialist law databases, or fields that emphasise preprints over the published article). Whilst normalised metrics address this to a point, the philosophical debates about the merits of using bibliometrics have prompted a range of activity in the area of responsible metrics.
You may be asked for author metrics as part of a promotion or other process - these can be traditional or normalised.
Traditional author metrics
These include:
- Citation count, or sum of times cited, a simple measure of the number of citations for an article, researcher or publication
- h-index, a score that attempts to summarise a researcher's impact and productivity in a single number: a researcher has an h-index of h if h of their publications have been cited at least h times each. It is controversial because it tends to favour late-career researchers in disciplines with fast publication rates and cultures of multiple authorship.
Traditional author metrics can be obtained from author profiles on Web of Science, Scopus, SciVal, Google Scholar and Harzing's Publish or Perish.
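The h-index calculation itself is simple to illustrate. A minimal sketch in Python (the function name and citation counts are illustrative, not taken from any real profile):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; a paper at rank r
    # contributes to the h-index while it has >= r citations.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts.
# Four papers have at least 4 citations each, so h = 4.
print(h_index([10, 8, 5, 4, 3]))
```

Note how the same h-index can hide very different profiles: one highly cited paper and one uncited paper both give h = 1, which is part of why the score is treated with caution.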
Normalised author metrics
These include the author beamplot on Web of Science, which plots the citation percentile of each of your publications (normalised for subject, year and document type), giving a picture of your performance over time.
Check your profile and author metrics on each service and quote the best ones, making sure you acknowledge the source of your score and the date it was generated (different databases will give different scores because of their differing coverage). Whilst normalised author metrics are perceived as fairer than the h-index, they are not a perfect solution and should be augmented with some of the article metrics detailed below.
Article metrics may be "traditional", normalised, patent or smart.
Traditional article metrics
These include the citation count, which is usually displayed in the results list and can be obtained from the three key platforms: Web of Science, Scopus and Google Scholar.
Normalised article metrics
Normalised article metrics include:
- Highly Cited on Web of Science. The Highly Cited award is given to papers in the top 1% by citations in each of the 22 Essential Science Indicators research fields, per year, and displays as a small golden trophy in the results list and on the article page.
- Hot papers on Web of Science. Hot papers are in the top 0.1% of papers by citations for field and age, and display as a small red flame in the results list and on the article page.
- Percentiles on Web of Science. Percentiles can be found via the full author beamplot (see above) by hovering over the purple points. A higher percentile value means better performance: if a paper has a percentile value of 99, it is in the top 1% of most-cited publications of the same document type, year and subject category.
- Field-weighted citation impact (FWCI) on Scopus. The FWCI is a score based on the number of citations received, compared with the expected number of citations for similar documents. An FWCI of 1 indicates that a paper is receiving exactly the expected number of citations for a paper of that age, discipline and type; higher than 1 means that your paper has more citations than average. The FWCI is available by clicking on 'View all metrics' on the Scopus document page.
- Percentile benchmark on Scopus. This complements the FWCI and compares items of the same age, subject area and document type over an 18-month window. The 99th percentile is high, and indicates an article in the top 1% globally. Percentiles are available by clicking on 'View all metrics' on the Scopus document page.
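The arithmetic behind both of these normalised indicators can be sketched in a few lines. The numbers below are made up for illustration; Scopus derives the "expected" citation count from its own field, age and document-type averages, and real percentile methodologies vary in how they handle ties:

```python
def fwci(actual_citations, expected_citations):
    """Field-weighted citation impact: actual citations divided by the
    average (expected) citations for comparable documents."""
    return actual_citations / expected_citations

def citation_percentile(paper_citations, field_citations):
    """Simplified percentile: the share of comparable papers with
    strictly fewer citations than this paper."""
    beaten = sum(1 for c in field_citations if c < paper_citations)
    return 100 * beaten / len(field_citations)

# A paper with 30 citations where similar papers average 15:
print(fwci(30, 15))                                  # 2.0 (twice the expected rate)

# The same paper against a (tiny, hypothetical) comparison set:
print(citation_percentile(30, [2, 5, 8, 12, 40]))    # 80.0 (beats 4 of 5 papers)
```

This is why an FWCI of 1 means "exactly as expected" and a 99th percentile means "top 1%": one is a ratio against the field average, the other a rank within the field distribution.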
Patent citations
Patent citations count how many times your research output has been mentioned and cited by patents. These are available through the following sources:
- Altmetrics (see section below)
- SciVal counts how many patents cite your research publications, and also how many of your publications have generated these patent citations. Go to your profile, then in the overview section click on the Patent Impact tab. NB: results will vary depending on the date range you select.
Smart citation metrics
Smart citation platforms such as Scite and Semantic Scholar use artificial intelligence to give more context to citations:
- Scite enables you to see how many citations are supporting, disputing or mentioning (personal subscription required, although you can register for a free seven-day trial).
- Semantic Scholar shows how many papers cite the background, results, methodology etc., and identifies "highly influential citations" - papers which have drawn heavily on your article, for example by citing it several times.
- Web of Science (WoS) offers enriched cited references for a proportion of its papers. For these, WoS assigns each citation to one of five categories - background, basis, support, differ and discuss - and "snippets" of text around the in-text citations can be viewed. Some papers also provide a visualisation showing where in the citing paper each citation sits (x-axis) against the five categories (y-axis). These can be accessed by running a search and then using the filter to limit results to Enriched cited references.
Altmetrics for your portfolio
A type of article-level metric, altmetrics score and link to the attention a research output has received on social media and other platforms. The qualitative information surfaced by altmetrics can be used to augment your author and article citation metrics. Key altmetric information includes:
- where your work has been cited in policy documents
- patent citations
- media, particularly news, mentions
- Wikipedia citations
- social media activity, indicating where there is interest in specific community groups, or more public engagement
This six-minute video gives an overview of how University of Birmingham researchers can find Altmetrics for their research outputs.
Finding Altmetric information
The two key altmetric services are PlumX (providing the Plum Print) and Altmetric.com (providing the Altmetric 'donut'). Their summaries allow you to see the details of the attention received and to link out to the sources. The Plum Print and Altmetric donut are embedded in a number of places, including publisher platforms, bibliographic databases (e.g. Scopus - go to the document record, then "View all metrics" to see the Plum Print) and the University of Birmingham Research Portal.
Researchers can also download the Altmetric It! bookmarklet tool to their browser toolbar, and access the altmetric donut and underlying information for any document with a DOI:
- Go to Altmetric for Researchers and select 'Tools'. Click to 'Learn more' for the Altmetric bookmarklet.
- Complete the form, then click and drag the bookmarklet onto your toolbar.
- Search for a key paper in your area on your usual service, then link out to the full text via a DOI link. Click on the 'Altmetric it!' bookmarklet on your toolbar to generate the Altmetric donut. Clicking on the score generates an altmetric summary including which social media channels have picked up on the paper.
Note: use the Altmetric attention score only as a preliminary indicator of the amount of attention an item has received. It can help to identify where there are 'mentions', and signifies where an item has achieved a high level of engagement.
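If you prefer to look up attention data programmatically rather than via the bookmarklet, Altmetric also provides a free, rate-limited public API keyed by DOI (check Altmetric's current terms before relying on it). A minimal sketch; the DOI shown is a placeholder, not a real record:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Public Altmetric API endpoint for DOI lookups (v1).
API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the Altmetric API URL for a given DOI."""
    return API_BASE + quote(doi, safe="/")

def fetch_attention_score(doi):
    """Fetch the Altmetric attention score for a DOI.
    Returns None if the item has no Altmetric record or the
    request fails (e.g. offline, rate-limited)."""
    try:
        with urlopen(altmetric_url(doi)) as resp:
            return json.load(resp).get("score")
    except Exception:
        return None

# URL construction only (no network call made here):
print(altmetric_url("10.1000/example.doi"))
```

As with the bookmarklet, the JSON response breaks the score down by source (news, policy, social media and so on), which is usually more useful for a narrative CV than the headline number.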
Tip: if your Pure profile is up-to-date, you may like to start exploring altmetrics from your list of Research Outputs on the Research Portal.
This blog post gives some ideas and examples of how you can communicate altmetrics, and incorporate them into a narrative format.
Collaboration indicators
Collaboration indicators are based on the co-authors on your publications and can be obtained from two sources:
- University of Birmingham's Research Portal provides a clickable world map that displays your shared research output. Go to your profile, then scroll down through the overview to the network section. NB: the data will only be as complete as the list of publications on your Pure profile.
- SciVal provides percentages of international, national and institutional collaboration, as well as Academic-Corporate collaboration, based on your publications indexed in Scopus. Go to your profile, then in the overview section click on the collaboration tab. NB: results will vary depending on the date range you select.
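The percentages SciVal reports can be illustrated with a simple count over co-author affiliations. A sketch with entirely made-up affiliation data, classifying each paper by its widest collaboration level (international over national over institutional):

```python
def collaboration_profile(papers, home_institution, home_country):
    """Classify each paper by its widest collaboration level and
    return the percentage of papers in each category.
    Each paper is a list of (institution, country) co-author pairs."""
    counts = {"international": 0, "national": 0, "institutional": 0}
    for affiliations in papers:
        countries = {country for _, country in affiliations}
        institutions = {inst for inst, _ in affiliations}
        if countries - {home_country}:            # any overseas co-author
            counts["international"] += 1
        elif institutions - {home_institution}:   # other UK institutions
            counts["national"] += 1
        else:                                     # single-institution paper
            counts["institutional"] += 1
    return {k: 100 * v / len(papers) for k, v in counts.items()}

# Hypothetical four-paper portfolio:
papers = [
    [("Birmingham", "UK"), ("MIT", "US")],         # international
    [("Birmingham", "UK"), ("Oxford", "UK")],      # national
    [("Birmingham", "UK"), ("Birmingham", "UK")],  # institutional
    [("Birmingham", "UK"), ("Leiden", "NL")],      # international
]
print(collaboration_profile(papers, "Birmingham", "UK"))
```

This is why the SciVal figures depend on the date range selected: changing the window changes which papers are counted, and each paper contributes to exactly one category.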
A move away from traditional author metrics?
Before a publication appears with author names attached, a huge amount of work goes into the associated research project. Contributions will have been made by many people whose names won't appear on the paper as authors. Initiatives such as CRediT (Contributor Roles Taxonomy) are being explored to give recognition to people in roles other than principal investigator and lead researcher.
Further help
Libraries and Learning Resources offers online training via the Influential Researcher Canvas Course.
For one-to-one appointments and bespoke workshops, contact the Research Skills Team in Libraries and Learning Resources.