
Bibliometrics & Research Impact: Responsible Metrics

Responsible use of metrics

Bibliometrics provide useful quantitative measures of citation impact, but on their own they do not give a complete picture of research impact.

There is increasing recognition worldwide of the importance of responsible use of bibliometrics in research assessment. A number of frameworks have emerged to assist with the move to responsible use of research metrics, amongst them ‘The Metric Tide’, the ‘Leiden Manifesto for Research Metrics’, and most notably ‘The San Francisco Declaration on Research Assessment’ (DORA). 

It is therefore important that we use bibliometrics responsibly. Some things to keep in mind:

  • Citation patterns vary widely between research fields, so compare like with like.
  • Don’t rely on a single citation tool.

Remember:

  • Coverage varies in content, depth, and discipline: no citation tool is comprehensive, as none indexes all publications or research areas.
  • Some disciplines publish less in journals than others.
  • Data needs to be looked at in context: use a variety of metrics, data, and other qualitative information where appropriate.
  • A citation is not necessarily positive; work may be cited in order to criticise or refute it.

DORA: The San Francisco Declaration on Research Assessment


The San Francisco Declaration on Research Assessment (DORA) is a worldwide initiative covering all scholarly disciplines. The signatories are concerned about the increasing use of the Journal Impact Factor as a tool to evaluate and compare the research output of individuals and institutions.

In 2012, the American Society for Cell Biology and a group of editors and publishers of scholarly journals drafted and circulated a declaration recognising the need to improve the way in which the outputs of scientific research are evaluated.

"The Journal Impact Factor, as calculated by Thomson Reuters, was originally created as a tool to help librarians identify journals to purchase, not as a measure of the scientific quality of research in an article. With that in mind, it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment. These limitations include:

  • citation distributions within journals are highly skewed
  • the properties of the Journal Impact Factor are field-specific: it is a composite of multiple, highly diverse article types, including primary research papers and reviews
  • Journal Impact Factors can be manipulated (or "gamed") by editorial policy
  • data used to calculate the Journal Impact Factors are neither transparent nor openly available to the public."
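The skewness problem above can be illustrated with a minimal Python sketch. The standard two-year Impact Factor is a journal-level mean: citations received in one year to items published in the two preceding years, divided by the number of citable items from those years. The citation counts below are entirely hypothetical, invented only to show how a few highly cited papers can inflate that mean far above what a typical article in the journal achieves.

```python
from statistics import median

# Hypothetical citation counts for 20 articles a journal published
# over the two preceding years (made-up numbers for illustration).
# A couple of highly cited papers dominate the distribution.
citations = [0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 4, 4, 5, 6, 40, 120]

# Impact-Factor-style figure: the journal-level mean.
impact_factor = sum(citations) / len(citations)

# What a typical paper in this journal actually achieves.
typical_paper = median(citations)

print(f"Impact-Factor-style mean: {impact_factor:.1f}")  # 10.0
print(f"Median citations:         {typical_paper}")      # 2.0
```

The mean (10.0) is five times the median (2.0): most articles in this toy journal receive two or fewer citations, yet the journal-level figure suggests otherwise, which is why DORA cautions against using it to judge individual papers.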

 

Many funding agencies are now DORA signatories, including Science Foundation Ireland and the Health Research Board (HRB).

The Leiden Manifesto

Published in Nature in April 2015 by five experts led by Diana Hicks, professor in the School of Public Policy at the Georgia Institute of Technology, and Paul Wouters, director of CWTS at Leiden University, the Leiden Manifesto proposed ten principles for the measurement of research performance.

The ten principles of the Leiden Manifesto are as follows:

  1. Quantitative evaluation should support qualitative, expert assessment.
  2. Measure performance against the research missions of the institution, group, or researcher.
  3. Protect excellence in locally relevant research.
  4. Keep data collection and analytical processes open, transparent, and simple.
  5. Allow those evaluated to verify data and analysis.
  6. Account for variation by field in publication and citation practices.
  7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
  8. Avoid misplaced concreteness and false precision.
  9. Recognize the systemic effects of assessment and indicators.
  10. Scrutinize indicators regularly and update them.

Useful Resources on Responsible Research Metrics