Calculated from the previous two years' worth of citation data found in the Web of Science database. It gives an approximate measure of the average number of citations received in a given year by articles published in that journal over the previous two years (so a 2019 JIF is the average number of citations received in 2019 by articles published in 2017-18). Citations are not weighted, and you cannot draw conclusions by comparing journals across subject boundaries, as the JIF does not take into account differences in publication or citation culture.
Journal impact factors are one of the measures by which Journal Citation Reports compares, ranks and evaluates journals. The JCR uses Web of Science data and is published annually with two editions, Social Science and Science. The JCR is available via the Library's A-Z.
e.g. an impact factor of 2.5 means that, on average, each article has been cited 2.5 times.
For example, the 2019 impact factor of a journal would be calculated as A/B, where:
A = the number of times that all items published in that journal in 2017 and 2018 were cited by indexed publications during 2019.
B = the total number of 'citable items' published by that journal in 2017 and 2018.
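The A/B calculation above can be sketched as a short function (the figures in the example are invented for illustration, not real journal data):

```python
def journal_impact_factor(citations, citable_items):
    """Compute a JIF as A / B.

    citations:     A = citations received in the JIF year to items
                       published in the previous two years.
    citable_items: B = 'citable items' published in those two years.
    """
    if citable_items == 0:
        raise ValueError("journal published no citable items in the window")
    return citations / citable_items

# Hypothetical 2019 JIF: 500 citations in 2019 to items published
# in 2017-18, from 200 citable items published in 2017-18.
print(journal_impact_factor(500, 200))  # 2.5
```

This matches the interpretation given above: a value of 2.5 means each article in the window was cited 2.5 times on average.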
There are a number of issues with the JIF:
Further information: https://clarivate.com/webofsciencegroup/essays/impact-factor/
The following journal metrics have been developed by Scopus as alternatives to the JIF:
Calculated from the previous three years' worth of citation data found in the Scopus database. Launched in December 2016, CiteScore is similar to the JIF, but is updated monthly as well as annually. It gives an approximate measure of the average number of citations received in a given year by articles published in that journal over the previous three years (so a 2016 CiteScore is the average number of citations received in 2016 by articles published in 2013-15). Citations are not weighted, and you cannot draw conclusions by comparing journals across subject boundaries, as the metric does not take into account differences in publication or citation culture.
Further information on the (2020) updated methodology for CiteScore: Scopus Blog June 2020
Calculated from the previous three years' worth of citation data found in the Scopus database. The SCImago Journal Rank (SJR) weights citations based upon where they come from (a journal with a higher or lower SJR), and normalises them based upon the set of documents which cite its papers, thus providing a 'classification free' measure for comparison.
Further information: http://www.scimagojr.com/
Calculated from the previous three years of citation data found in the Scopus database. A journal's 'subject field' is taken into account, normalising for subject-specific citation cultures (average number of citations, amount of indexed literature, speed of publication) to allow an easier comparison of scores for journals between different subject areas.
Further information: https://www.elsevier.com/solutions/scopus/how-scopus-works/metrics
The Journal Citation Reports tool does not include an edition for the arts and humanities. The Scopus journal metrics can offer useful insight into arts and humanities titles. In fields where monographs are the dominant format for scholarly communications, metrics based on journal citation data cannot convey a complete picture of journal impact.
In terms of assessing the quality of arts and humanities journals, there are some tools that can offer an indication. Inclusion in the Arts and Humanities Citation Index in Web of Science can be an indication of quality. Similarly, the European Reference Index for the Humanities and Social Sciences compiles a list called ERIH Plus. It contains titles that meet certain criteria, including: transparent peer-review practices, a valid ISSN and an academic editorial board.
Some disciplines refer to their own journal ranking lists, for example:
CABS (Chartered Association of Business Schools) Academic Journal Guide. This list is used in business and economics; it grades journals with a rating of 1-4 stars. It includes data from other metric tools but doesn't rely on them to come to a final ranking; rather, it gives this data to subject experts who consider it alongside other information.
Washington and Lee School of Law. This ranking looks at citations over the last eight years. The aim is to prevent bias in favour of long-established journals, so it is concerned only with citations to current scholarship. Sources for the citation counts are limited to documents that are found in Westlaw.
Discipline-specific lists can be a helpful guide when thinking about where to publish, or what journals to read. It is important to consider the methodologies that inform these lists, and to use them in conjunction with other metric tools.
Aside from impact factor and ranking, journal data can provide wider insights into the impact of a journal that can help inform decision making. Consider exploring: