
Publication Metrics

Responsible metrics- key messages on measuring journal impact

DORA: "assess research on its own merits rather than on the basis of the journal in which the research is published."

 

Leiden Manifesto: "The impact factor is calculated for journals indexed in the US-based and still mostly English-language Web of Science. These biases are particularly problematic in the social sciences and humanities, in which research is more regionally and nationally engaged."

 

The Metric Tide: "placing too much emphasis on narrow, poorly-designed indicators – such as journal impact factors (JIFs) – can have negative consequences."

Journal Impact Factor

Calculated from the previous 2 years' worth of citation data found in the Web of Science database. It gives an approximate measure of the average number of citations received in a given year by articles published in that journal during the previous two years (so a 2019 JIF is the average number of citations received in 2019 for articles published in 2017-18). Citations are not weighted, and you cannot draw reliable conclusions from comparing journals across subject boundaries, as the measure does not take into account differences in publication or citation culture.

Journal impact factors are one of the measures by which Journal Citation Reports compares, ranks and evaluates journals. The JCR uses Web of Science data and is published annually with two editions, Social Science and Science. The JCR is available via the Library's A-Z. 

The impact factor of a journal is a measure of the frequency of citation of an average article in that journal over a particular period:

e.g. an impact factor of 2.5 means that, on average, an article in that journal has been cited 2.5 times over that period.

Calculation

For example, the 2019 impact factor of a journal would be calculated as A/B, where:

A = the number of times that all items published in that journal in 2017 and 2018 were cited by indexed publications during 2019.

B = the total number of 'citable items' published by that journal in 2017 and 2018.
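
As a rough worked illustration in Python (the figures below are invented for the example):

  # Hypothetical figures, for illustration only
  a_citations_in_2019_to_2017_18_items = 250   # A
  b_citable_items_2017_18 = 100                # B
  jif_2019 = a_citations_in_2019_to_2017_18_items / b_citable_items_2017_18
  print(jif_2019)  # 2.5 citations per article, on average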

Practical problems

There are a number of issues with the JIF:

  • Use of impact factors to assess research output is not appropriate for some disciplines, e.g. in arts and humanities
  • Journal metrics can only be compared across the same discipline and are not suitable for measuring inter-disciplinary journals
  • The impact factor can be affected if a journal:
    • publishes a high number of review articles in one year
    • suddenly changes size or title

Further information: https://clarivate.com/webofsciencegroup/essays/impact-factor/

CiteScore, SCImago Journal Rank (SJR) and Source Normalised Impact per Paper (SNIP)

The following journal metrics, all calculated from Scopus data, have been developed as alternatives to the JIF:

CiteScore

Calculated from the previous 3 years' worth of citation data found in the Scopus database. Launched in December 2016, CiteScore is similar to the JIF but is updated monthly as well as annually. It gives an approximate measure of the average number of citations received in a given year by articles published in that journal during the previous three years (so a 2016 CiteScore is the average number of citations received in 2016 for articles published in 2013-15). Citations are not weighted, and you cannot draw reliable conclusions from comparing journals across subject boundaries, as the measure does not take into account differences in publication or citation culture.

Further information on the updated (2020) methodology for CiteScore: Scopus Blog June 2020
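
The arithmetic is the same kind of ratio as the JIF, but over a three-year window and, broadly speaking, counting all indexed document types in the denominator rather than only 'citable items'. A rough Python sketch with invented figures:

  # Hypothetical figures, for illustration only
  citations_in_2016_to_2013_15_documents = 450
  documents_published_2013_15 = 300
  citescore_2016 = citations_in_2016_to_2013_15_documents / documents_published_2013_15
  print(citescore_2016)  # 1.5 citations per document, on average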

SCImago Journal Rank (SJR)

Calculated from the previous 3 years' worth of citation data found in the Scopus database. Citations are weighted based upon where they come from (a citation from a journal with a higher SJR counts for more than one from a journal with a lower SJR), and normalised based upon the set of documents which cite the journal's papers, thus providing a 'classification-free' measure for comparison.

Further information: http://www.scimagojr.com/
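
The idea that a citation from a prestigious journal counts for more is similar in spirit to PageRank. The Python sketch below is a minimal illustration of that kind of iterative, prestige-weighted calculation on a toy citation network; it is not the actual SJR algorithm, which applies further rules and normalisation.

  import numpy as np

  # Toy example: cites[i, j] = citations from journal j to journal i
  cites = np.array([[0, 3, 1],
                    [2, 0, 4],
                    [1, 1, 0]], dtype=float)

  # Each citing journal's influence is shared out across the journals it cites
  transfer = cites / cites.sum(axis=0)

  # Power iteration: a journal's prestige derives from the prestige of its citers
  n = cites.shape[0]
  prestige = np.ones(n) / n
  damping = 0.85
  for _ in range(100):
      prestige = (1 - damping) / n + damping * transfer @ prestige

  print(prestige / prestige.sum())  # relative prestige of the three toy journals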

Source Normalised Impact per Paper

Calculated from the previous 3 years' worth of citation data found in the Scopus database. A journal's 'subject field' is taken into account, normalising for subject-specific citation cultures (average number of citations, amount of indexed literature, speed of publication) to allow easier comparison of scores between journals in different subject areas.

Further information: https://www.elsevier.com/solutions/scopus/how-scopus-works/metrics
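
Very roughly, the normalisation divides a journal's raw citations per paper by a measure of the 'citation potential' typical of its subject field, so that journals in low-citation fields are not penalised. A simplified Python sketch with invented figures (the full methodology defines citation potential much more precisely):

  # Hypothetical figures, for illustration only
  raw_impact_per_paper = 2.0       # the journal's average citations per paper
  field_citation_potential = 4.0   # typical citation level in the journal's field
  snip = raw_impact_per_paper / field_citation_potential
  print(snip)  # 0.5; the same raw impact in a low-citation field would score higher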

Arts and Humanities Journals

The Journal Citation Reports tool does not include an edition for the arts and humanities. The Scopus journal metrics can offer useful insight into arts and humanities titles. In fields where monographs are the dominant format for scholarly communications, metrics based on journal citation data cannot convey a complete picture of journal impact.

In terms of assessing the quality of arts and humanities journals, there are some tools that can offer an indication. Inclusion in the Arts and Humanities Citation Index in Web of Science can be an indication of quality. Similarly, the European Reference Index for the Humanities and Social Sciences compiles a list called ERIH Plus. It contains titles that meet certain criteria, including transparent peer-review practices, a valid ISSN and an academic editorial board.

Discipline specific lists

Some disciplines refer to their own journal ranking lists, for example: 

CABS (Chartered Association of Business Schools) Academic Journal Guide. This list is used in business and economics; it grades journals with a star rating of 1-4 stars. It includes data from other metric tools but doesn't rely on them to reach a final ranking; rather, it gives this data to subject experts, who consider it alongside other information.

Washington and Lee School of Law. This ranking looks at citations to the last eight years of scholarship. The aim is to prevent bias in favour of long-established journals, so it is only concerned with citations to current scholarship. Sources for the citation counts are limited to documents found in Westlaw.

Discipline specific lists can be a helpful guide when thinking about where to publish, or what journals to read. It is important to consider the methodologies that inform these lists, and to use them in conjunction with other metric tools.

 

What can journal data tell you?

Aside from impact factor and ranking, journal data can provide wider insights into the impact of a journal that can help inform decision-making. Consider exploring:

  • Journals publishing articles which are cited the most
  • Journals whose articles are cited most quickly
  • Journals with a high number of citations from other journals belonging to the same publisher
  • Journals with a high percentage of articles which are never cited