Research evaluation is the measurement or indication of the performance of published research outputs. The Metric Tide (July 2015), an independent review commissioned by the UK government, concluded that metrics cannot and should not replace peer review in research assessment exercises. It urges the use of responsible metrics (see the Leiden Manifesto, published in Nature, and DORA).
The University of Plymouth has access to SciVal, a powerful citation analysis tool that can be used to benchmark research, assess the international reach of UoP research, and identify emerging areas of research, potential collaborators and the value of our current collaborations.
Such citation data can be used in national and global league tables and can help senior managers with strategic planning.
It is important to note that SciVal is based on Scopus data (both are Elsevier products); where work is not published in journals indexed in Scopus, SciVal cannot provide any analysis of those outputs. Other tools exist, and as proprietary products their data can be tendered for use in assessment exercises:
In the 2014 REF, Panels A and B had access to Scopus citation data. For REF 2021, Clarivate Analytics won the contract, so Web of Science data will be available to REF panels. Clarivate's citation analysis tool is InCites (the University does not have a subscription to InCites). The REF states that any use of citation data will be in accordance with the principles of The Metric Tide to ensure transparency and responsible use.
Web of Science citation data is analysed by Leiden University to produce independent rankings of universities. Leiden researchers are also behind the Leiden Manifesto, published in Nature, urging more responsible use of metrics in research evaluation. Limited options for comparing research across subjects and institutions are available for public use:
Harzing's Publish or Perish is a free tool for retrieving and analysing citation data from Google Scholar and Microsoft Academic. Its figures should not be compared with those from Scopus or Web of Science, as the methods for gathering and counting citations differ considerably across these tools:
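Publish or Perish reports indicators such as the h-index alongside raw citation counts. As a minimal sketch (not Harzing's implementation), the following shows how an h-index is derived from per-paper citation counts; the same formula yields different values across Google Scholar, Scopus and Web of Science simply because the underlying counts differ:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper still has enough citations for its rank
        else:
            break
    return h

# Hypothetical researcher with five papers:
print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers have at least 4 citations)
```

Because the calculation depends entirely on the source's citation counts, an h-index computed from one database is not comparable with one computed from another.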
There is a growing movement towards the responsible evaluation of research, to ensure that metrics are recognised as indicators rather than the absolute worth of a person's research endeavours. Citation practice varies across disciplines, so care must be taken when ranking research 'impact' across a group of researchers. Use of Impact Factors as a proxy for the quality of a published paper or an individual researcher is now frequently contested (see DORA). Universities are also developing their own responsible metrics policies in response to DORA, the Leiden Manifesto and, increasingly, funders, who are positioning Open Access policies in the context of a fairer research culture.
Plan S brings together global funders, including Wellcome, the European Commission, UKRI and others, with a view to standardising OA policies and accelerating the move to full Open Access. [See our full guidance on Plan S]. Plan S identifies a 'misdirected reward culture' of publishing in prestige journals that may not be aligned with the principles of openness. Its funders have committed to assessing grant applications on the intrinsic quality of the work, not the venue of publication, its Impact Factor or its publisher.
Dr Charles Martinez of Elsevier presented two workshops on using SciVal in 2018. UoP staff can access recordings of the workshops:
Contact your Information Specialist or Faculty Development & Partnership Manager (previously, R&I Business Partners) for more advice on SciVal.
There are many available metrics, serving many different purposes. Do you want to know about attention, cultural impact or disciplinary impact? Are you measuring research papers, people, institutions or books? The Metrics Toolkit, created by scientometricians, is a global endeavour showing how to apply metrics responsibly in the real world: