
Researcher Support Library Services: Research Assessment, Evaluation, Benchmarking and Reporting

Research evaluation: indicating the performance of, or measuring, published research outputs. The Metric Tide (July 2015), commissioned by the UK government, concluded that metrics cannot and should not replace peer review in research assessment exercises. It urges the use of responsible metrics (see the Leiden Manifesto, published in Nature, and DORA).

Producing Basic Reports in Symplectic Elements

Publication activity recorded in Symplectic Elements can be extracted for reporting purposes.

Producing a report requires sufficient administrative privileges. Please contact researchadvice@plymouth.ac.uk if you require the relevant permissions.

Below are some step-by-step instructions on producing a basic report (publications list) in Elements:

Tools for determining indicators or metrics:

Research Evaluation & Benchmarking tools

The University of Plymouth has access to SciVal, a powerful citation analysis tool that can be used for benchmarking research, assessing the international reach of UoP research, and identifying emerging areas of research, potential collaborators, and the value of our current collaborations.
Such citation data feeds into national and global league tables and can help senior managers with strategic planning.

Overview of benchmarking tools and caveats for use:

It is important to note that SciVal is based on Scopus data (both are Elsevier products): where work is not published in journals indexed in Scopus, SciVal can provide no analysis of those outputs. Other tools exist and, as proprietary data sources, can be tendered for assessment exercises:

In the 2014 REF, Panels A and B had access to Scopus citation data. For the 2021 REF, Clarivate Analytics won the contract, so Web of Science data will be available to REF panels. Clarivate's citation analysis tool is InCites (the University does not have a subscription to InCites). The REF states that any use of citation data will be in accordance with the principles of The Metric Tide, to ensure transparency and responsible use.

Web of Science citation data is analysed by Leiden University to produce independent rankings of universities. Leiden is behind the manifesto, published in Nature, urging more responsible use of metrics in research evaluation. Limited options for comparing research across subjects and institutions are available for public use:

Harzing's Publish or Perish is a free tool for retrieving and analysing citations from Google Scholar and Microsoft Academic Search. Its data should not be compared with Scopus or Web of Science figures, as the citation algorithms differ markedly across these tools:

SciVal & SciVal Guides

SciVal overview

Dr Charles Martinez of Elsevier presented two workshops on using SciVal in 2018. UoP staff can access recordings of the workshops:

SciVal user guides

Contact your Information Specialist or Faculty Development & Partnership Manager (previously, R&I Business Partners) for more advice on SciVal.

Introductory guides

Advanced guides

Responsible Metrics

When using citation-based metrics or other measures of impact, it is important to be aware of the issues surrounding their improper use. There is a growing movement towards the responsible evaluation of research, ensuring that metrics are recognised as indicators of, rather than the absolute worth of, a person's research endeavours.
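As a concrete illustration of why such figures are indicators rather than absolute measures, consider the h-index, one widely used citation-based metric (chosen here purely as an example; it is not specific to any of the tools above). A minimal Python sketch of its definition:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with h or more citations each."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Two very different publication records can yield the same value,
# which is one reason a single number cannot capture research quality.
print(h_index([10, 8, 5, 4, 3]))     # 4
print(h_index([100, 100, 4, 4, 1]))  # 4
```

Note that the two records above differ greatly in total citations yet share an h-index of 4, underlining the need to interpret any metric in context.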

To find out more about how to use measures of impact in a responsible way, visit our guidance on responsible metrics.