Research metrics, also known as bibliometrics or citation metrics, describe the statistical analysis of bibliographic and citation data. This overlaps with scientometrics, in which other datasets about scientific publications and activity are statistically analysed, often alongside publication and citation data, for insights into research performance, behaviours and strategy.
Proprietary literature databases such as Scopus and Web of Science are often the sources of data for metrics, but other tools have also emerged, including free tools based on open datasets.
As quantitative assessment of research has become popularised thanks to such tools, the inappropriate use and application of research metrics has inspired a number of declarations, manifestos and principles aiming to guide their responsible use and promote a healthier research culture.
The University of Plymouth has signed the San Francisco Declaration on Research Assessment (DORA), and in these pages you can find further information on research metrics, tools and services for finding metrics, and how to use metrics responsibly.
You can also come along to one of our regular research training sessions on responsible use of metrics.
Different tools draw on different datasets, so you may get different results depending on the data source. Note that no dataset includes every publication ever to exist. You should also ensure that the metrics you use are based on a dataset you can trust in terms of the quality of the data itself, how comprehensive it is for what you want to measure, and the noise within the dataset (e.g. are all sources academic journals? does it include mirror or predatory journals?).
How established a research field is, a researcher's career stage, the discipline you are examining and other factors can all affect the coverage of that research within a metrics tool and its suitability for comparison against other entities.
For example, many tools have insufficient representation of arts journals that are not indexed within databases, and article publication may not be as important a research output in some fields as it is in others; it would be unfair to count publications from an early career researcher and compare this to the number of publications from a senior researcher; you cannot compare a country's output to an institution's output; and so forth.
Make sure you use metrics that are appropriate to what you want to measure. For example, it is inappropriate to use journal impact factor as a proxy for your impact as a researcher, as the metric measures a citation average for a journal, not a person or an article.
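To make the journal-versus-person distinction concrete, here is a minimal sketch of how a standard two-year journal impact factor is derived; all figures are invented for illustration.

```python
# A minimal sketch of the two-year journal impact factor calculation.
# All figures below are invented for illustration.

def two_year_impact_factor(citations_this_year: int,
                           citable_items_prev_two_years: int) -> float:
    """Citations received this year by items the journal published in the
    previous two years, divided by the number of citable items it
    published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# A journal whose 2021-2022 items were cited 500 times in 2023,
# having published 200 citable items in 2021-2022:
print(two_year_impact_factor(500, 200))  # 2.5
# The result is an average for the journal as a whole; it says nothing
# about the citations of any individual article or author within it.
```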
Additionally, some metrics tools are very sophisticated and it is easy to make an invalid inference based on a graph or table. Learn about the metrics you intend to use, what they are designed to measure and what the weaknesses of that metric are. Triangulate metrics with other suitable metrics to verify findings and to provide additional insight.
Use of metrics should be supplemented with peer review, expert opinion and qualitative measures to provide a more rounded judgement. Metrics alone cannot answer complex queries and should not provide the sole evidence for high-level decisions.
When using citation-based metrics or other measures of impact, it is important to be aware of the issues surrounding their improper use. There is an increasing movement towards the responsible evaluation of research to ensure that metrics are recognised as indicators and not the absolute worth of a person's research endeavours.
To find out more about how to use measures of impact in a responsible way, visit our guidance on responsible metrics.
SciVal
The University of Plymouth has access to SciVal, a powerful citation analysis tool which can be used for benchmarking research, assessing the international reach of University of Plymouth research, and identifying potential emerging areas of research, potential collaborators and the value of our current collaborations.
Such citation data can be used in national and global league tables (though this brings with it concerns around responsible use of metrics) and can help senior managers with strategic planning.
It is important to note that SciVal is based on Scopus data (both are Elsevier products); where work is not published in journals indexed in Scopus, SciVal cannot provide analysis of those outputs.
The Library and R&I provide support to the University on the use of SciVal, including a consultation and reporting service.
Scopus
Scopus is a citation database that provides the data for SciVal. It also contains tools to help you analyse an area of research, along with metrics you can export, author and institutional profiles and more.
Web of Science
Web of Science is an abstracts and indexing platform that provides access to various indexes, citation information, author profiles and more. The content available is dependent on the subscribing institution's level of access. You can use it to analyse a body of research and export basic metrics for those outputs.
Altmetrics tools
The term altmetrics (or 'alternative metrics') is often used to describe measures which capture attention a research output receives beyond the scope of more traditional metrics such as citation counts and impact factors. Often, these measures focus on impact based on online activity, for example social media mentions, blog posts, and news mentions. The University has access to two providers of Altmetrics data:
PlumX
PlumX scores are embedded into Pure Portal and Pearl records. Some limited PlumX reporting is possible via SciVal and Pure - please ask the library for further advice: openresearch@plymouth.ac.uk
Dashboards of PlumX scores and links to citing policy documents are available through the Pearl reporting dashboard, which is available to all. Pearl is the repository of University of Plymouth open access outputs; PlumX data in Pearl is therefore only available for outputs deposited by staff into Pearl via Pure or legacy systems.
Altmetric
A research output's Altmetric Attention Score appears as a distinctive 'donut' which is often displayed next to published outputs. Clicking on an Altmetric donut shows you how many times an output was mentioned on different social media and news outlets.
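If you want to retrieve attention data programmatically rather than via the donut, Altmetric also offers a public details API. Below is a minimal sketch using its free DOI endpoint; the DOI is hypothetical and the exact response fields and rate limits may vary, so check Altmetric's API documentation before relying on them.

```python
# A minimal sketch of fetching an output's Altmetric Attention Score via
# Altmetric's public details API. The DOI below is hypothetical, and the
# exact response fields may vary - check Altmetric's API documentation.
import requests

doi = "10.1000/example.doi"  # hypothetical DOI
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")

if resp.status_code == 200:
    data = resp.json()
    print("Attention score:", data.get("score"))
    print("Details page:", data.get("details_url"))
else:
    # A 404 means Altmetric holds no attention data for this DOI
    print("No Altmetric data found:", resp.status_code)
```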
The University has purchased Altmetric for Institutions (Altmetric Explorer). Sign in with your University email address and password to set up email alerts for new activity and to create, save and export reports.
1. Sign in to Altmetric Explorer using your University login.
2. Click on the 'My Institution' symbol on the left hand panel. From here you should be able to find the Altmetric profiles for individual University of Plymouth affiliated authors as well as faculty and school groups.
If you've received mentions in Altmetric, you can explore these on your profile.
Highlights provides an overview of your mentions statistics, including your top outputs, latest mentions, and overall mentions by country or region.
Research Outputs provides a full list of your outputs that have received mentions. You can click 'sort by' to arrange these in order of publication date or attention score. You can also download these results in csv format by clicking 'Export this tab' (see the sketch after this list).
Timeline lets you view all mentions over time, and filter by specific attention source types. You can select multiple attention sources to view in the chart by holding down the Command key on Mac computers, or the Control key on Windows computers.
Demographics shows you maps of geolocations for four attention sources: Twitter, Facebook, News, and Policy. If you click on the name of a country in the table, or the country itself on the map, you will be taken to the Mentions Tab, where you can view all of the mentions originating from that country in the selected attention source.
Mentions lets you view all the individual mentions across all attention sources, or drill into specific time periods (e.g. to view all mentions that occurred during a particular week). Mentions can be filtered by attention source type, mention outlet or author name, country, and mention time. The Mentions Tab also allows you to surface internationally recognised mainstream news attention via the 'Show Highlights Only' option.
Mention sources shows you where your mentions are coming from, e.g. which Twitter accounts are tweeting about your work.
Journals shows a comparison table that displays all the total mention counts for different Altmetric attention sources, aggregated by journal. For example, if you were to run a search query for all research outputs published by a specific journal publisher, then you would see all their journals listed in this table, along with the mention counts for each source.
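As a brief illustration of working with an export from the Research Outputs tab mentioned above, the sketch below loads the CSV and ranks outputs by attention score. The file name and column names are hypothetical; check the header row of your own export.

```python
# A minimal sketch of analysing a CSV exported from Altmetric Explorer.
# File and column names are hypothetical - check your export's header row.
import pandas as pd

df = pd.read_csv("altmetric_export.csv")

# Rank outputs by attention score, highest first (hypothetical columns)
top = df.sort_values("Altmetric Attention Score", ascending=False)
print(top[["Title", "Altmetric Attention Score"]].head(10))
```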
To get notified when a specific research output is mentioned, navigate to anywhere on your Altmetric profile where you can find your outputs (see guidance above) or search for a specific output in the general search box.
Clicking on the title of an output will take you to its Altmetrics Details Page. On the right hand side of this page, you can click 'Alert me about new mentions' to get an email notification of any new mentions for this specific output.
You can also sign up to receive regular reports of Altmetric attention for a specific set of outputs by setting up an email report.
Strengths and weaknesses of Altmetrics tools
| Altmetrics strengths | Altmetrics weaknesses |
| --- | --- |
| Accumulate more quickly than citation-based metrics | As with any research metric, they cannot be used in isolation to tell the 'whole story' of an output |
| Capture a greater diversity of impact than citation-based metrics, and demonstrate the impact of a wider range of outputs (not just journals and books) | Negative attention may lead to a 'higher' altmetric score – information must be placed into context to be meaningful |
| Can be used in conjunction with other metrics to create a broader and more comprehensive picture of an output's impact | Unlikely to be comprehensive – for example, Altmetric only picks up mentions which refer to a unique identifier such as a DOI |
| Can help to demonstrate impact outside of the academic world | As with many quantitative metrics, altmetrics have the potential to be biased, manipulated, or gamed |

When using any research metric, including altmetrics, it is imperative to be aware of the issues surrounding their improper use. To find out more about how to use measures of impact in a responsible way, visit our guidance on responsible metrics.
Google Scholar
Google Scholar provides an alternative citation count to Scopus and is openly available. You can search for articles or authors, and citations to articles are computed and updated automatically as Google Scholar updates. Google Scholar trawls the web to pick up references to articles and does not require a source to meet the same criteria as Scopus in order to be included; this can result in a significant difference between your citation counts in Google Scholar and Scopus.
Google Scholar Metrics enable authors to quickly gauge the visibility and influence of recent articles in scholarly publications. Scholar Metrics can be used to browse the top 100 publications by their five-year h-index and h-median metrics, and are currently based on the Google Scholar index as it was in June 2020.
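For clarity, the h-index is the largest number h such that h items have each received at least h citations, and the h-median is the median citation count within that h-core. A minimal sketch, using made-up citation counts:

```python
# A minimal sketch of the h-index and h-median calculations reported by
# Scholar Metrics, computed over a made-up list of citation counts.
from statistics import median

def h_index(citations: list[int]) -> int:
    """Largest h such that at least h items have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def h_median(citations: list[int]) -> float:
    """Median citation count of the h items that make up the h-core."""
    counts = sorted(citations, reverse=True)
    h = h_index(counts)
    return median(counts[:h]) if h else 0

cites = [25, 12, 9, 7, 7, 3, 1]   # hypothetical citation counts
print(h_index(cites))   # 5 (five items with at least 5 citations each)
print(h_median(cites))  # 9 (median of the five h-core counts)
```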
Further information on metrics in Google Scholar is available from Google Scholar's own documentation.
InCites - note the University does not have a subscription to InCites
As well as SciVal, other competitor products exist. One is InCites, Clarivate's (Web of Science) equivalent to SciVal, which draws on Web of Science data. As all these tools are proprietary and 'own' the data on publications within them, their corporate owners can tender the products for assessment exercises:
In the 2014 REF, Panels A and B had access to Scopus citation data. For the 2021 REF, Clarivate Analytics won the contract and Web of Science data was made available to REF panels. The 2021 REF stated that any use of citation data would be in accordance with the principles of The Metric Tide to ensure transparency and responsible use.
Leiden
Leiden University analyses Web of Science citation data to produce independent rankings of universities (the CWTS Leiden Ranking). Leiden is also behind the Leiden Manifesto, published in Nature, urging more responsible use of metrics for research evaluation. Limited options for comparing research across subjects and institutions are available for public use:
Publish or Perish
Harzing's Publish or Perish is a free tool for retrieving and analysing Google Scholar citation data. Its data should not be compared to Scopus or Web of Science, as citation algorithms are very different across all these tools:
This toolkit is easy to navigate and is a good beginners' guide to various types of metric.