Library Research Support & Services: Tracking & Measuring Impact: H Index & other metrics
Bibliometrics refers to methods for measuring influence or impact in the research literature, and to the analysis of those measurements. The extent to which bibliometrics are used, and the weight given to them, varies across subject areas. There are metrics for authors, publications and journals. Your Information Specialist can provide further guidance.
The h-index, created by Jorge Hirsch, is the main author-level metric: a researcher has an h-index of h when h of their papers have each been cited at least h times.
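For illustration, here is a minimal Python sketch of how an h-index is computed from a list of per-paper citation counts (the largest h such that h papers each have at least h citations); it is not tied to any particular database's implementation:

```python
def h_index(citations):
    """Return the largest h such that h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # this paper still supports an h-index of `rank`
            h = rank
        else:
            break           # counts only decrease from here, so stop
    return h

# Five papers with these citation counts: four papers have >= 4 citations,
# but there are not five papers with >= 5 citations, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note that because the result depends entirely on which citations are counted, the same researcher gets a different h-index from each database.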
"Obviously a single number can never give more than a rough approximation to an individual's multifaceted profile, and many other factors should be considered in combination in evaluating an individual … especially in life-changing decisions such as the granting or denying of tenure." (Hirsch, http://arxiv.org/PS_cache/physics/pdf/0508/0508025v5.pdf)
Databases such as Scopus, Web of Science and Google Scholar can calculate an h-index. Each database has its own source list of indexed titles from which citations are counted, so the h-index generated by each tool will differ for the same researcher.
Disciplinary differences in citation practice mean h-indexes require normalisation and context before any judgement of 'value': the lifespan of articles, the coverage of the research topic within the source and the rate of publication in a subject area all mean that an h-index is only meaningful when compared with others in the same subject.
Early Career Researchers and researchers who have taken career breaks can be disadvantaged when compared against other researchers using the H-Index.
Self-citations will increase the h-index (some information sources, such as Web of Science, will identify self-citations).
Negative citations: a citation count does not distinguish praise from criticism. A paper may accumulate citations through critical review and evaluation of the research, scrutiny which could ultimately lead to the paper being withdrawn or retracted.
Web of Science and Scopus automatically generate profiles for authors whose work is indexed in their databases. See our separate guidance on managing author profiles if you need to make amendments to, or claim, these system-generated profiles.
Journal metrics can help you establish whether you are publishing in the most appropriate journal for your research, or whether you could have greater impact by publishing elsewhere.
See our separate guidance on 'identifying where to publish', which covers journal metrics such as the Impact Factor (Web of Science's proprietary ranking of journals) and Scopus's equivalent, CiteScore, plus other measures for evaluating journal quality.
Web of Science, Scopus and Google Scholar all show in their search results the number of times a paper has been cited by other papers. Each tool will generate a different figure for the same paper because of differences in the sources they index.
Databases will attempt to normalise and contextualise these numbers:
Web of Science will flag both 'hot papers in field' (recent papers) and 'highly cited in field' papers in your search hits
Web of Science also shows a 'usage count': the number of times the record has been viewed in Web of Science
Scopus gives a Field-Weighted Citation Impact alongside each citation count
Both tools enable ranking of hits by citation count from high to low, and further in-depth analysis of citations:
How to analyse citations in Web of Science:
How to analyse citations in Scopus:
Altmetrics can highlight the attention papers are receiving and can be especially useful for recently published works that have yet to generate traditional citations. Altmetrics capture mentions from social media sites, newspaper articles, policy documents, television and radio.
Scopus also brings tweets and blog mentions into each record via PlumX Metrics, a competitor to Altmetric.