Journal ranking

Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.

Measures

Traditionally, journal ranking "measures" or evaluations have been provided through institutional lists established by academic leaders or through committee vote. These approaches have been notoriously politicized and inaccurate reflections of actual prestige and quality, as they often reflect the biases and personal career objectives of those ranking the journals, and they produce highly disparate evaluations across institutions.[1][2] Many institutions have therefore required external sources of evaluation of journal quality. The traditional external approach has been to survey leading academics in a given field, but this approach also has potential for bias, though not as profound as that of institution-generated lists.[2] Consequently, governments, institutions, and leaders in scientometric research have turned to a range of observed journal-level bibliometric measures that can serve as surrogates for quality, eliminating the need for subjective assessment.[3]

As a result, several journal-level metrics have been proposed, most of them citation-based:

  • Impact factor – reflecting the average number of citations to articles published in science and social science journals.
  • Eigenfactor – a rating of the total importance of a scientific journal according to the number of incoming citations, with citations from highly ranked journals weighted to make a larger contribution to the eigenfactor than those from poorly ranked journals.
  • SCImago Journal Rank – a measure of scientific influence of scholarly journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where such citations come from.
  • h-index – usually used as a measure of scientific productivity and the scientific impact of an individual scientist, but can also be used to rank journals.
  • Expert survey – a score reflecting the overall quality or contribution of a journal, based on the results of a survey of active field researchers, practitioners and students (i.e., actual journal contributors or readers), who rank each journal against specific criteria.[4]
  • Publication power approach (PPA) – the ranking position of each journal is based on the actual publishing behavior of leading tenured academics over an extended time period. As such, the journal's ranking position reflects the frequency at which these scholars published their articles in this journal.[5][6]
  • Altmetrics – rate journals based on scholarly references added to academic social media sites.[7]
  • diamScore – a measure of scientific influence of academic journals based on recursive citation weighting and the pairwise comparisons between journals.[8]
  • Source normalized impact per paper (SNIP) – a factor released in 2012 by Elsevier, based on Scopus, to estimate impact.[9] The measure is calculated as SNIP = RIP / (R / M), where RIP = raw impact per paper, R = citation potential, and M = median database citation potential.[10]
  • PageRank – in 1976, a recursive impact factor was proposed that gives citations from journals with high impact greater weight than citations from low-impact journals.[11] Such a recursive impact factor resembles Google's PageRank algorithm, though the original Pinski and Narin paper uses a "trade balance" approach in which journals score highest when they are often cited but rarely cite other journals; several scholars have proposed related approaches.[12][13][14] In 2006, Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel also proposed replacing impact factors with the PageRank algorithm.[15] The Eigenfactor is another PageRank-type measure of journal influence,[16] with rankings freely available online, along with SCImago.[17]
  • JRank – JournalsRanking (JRank) is a digital portal developed by iMaQ Technologies Pvt. Ltd. in 2015 that lists international journals indexed in ISI-JCR and Scopus-SJR, based on the current impact factor (IF) and quartiles (Q) assigned by Thomson Reuters and Scopus, respectively. JRank also provides detailed information about each journal, such as its country of publication, impact factor history, publication frequency, and active web link. Subject-based lists of journals can also be viewed through the portal.[18]
  • h5-index – this metric, calculated and released by Google Scholar, is based on the h-index of all articles published in a given journal in the last five years.[19]
  • NCPPU (net cost per paid use) – a revised measure of the cost per download at a given institution, combining various measures of value. It is offered by Unpaywall Journals and used by library systems such as the SUNY Libraries Consortium, which used it to select the 248 most useful journals to subscribe to from Elsevier's offering instead of its big deal.[20]
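To make the h-index entry above concrete, here is a minimal Python sketch that computes the h-index from a list of citation counts; the counts are hypothetical:

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical journal with five articles:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Google Scholar's h5-index applies the same computation to the set of articles a journal published in the preceding five years.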
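The SNIP formula quoted above is simple to evaluate once RIP, R, and M are known; the values below are hypothetical, chosen only to show the arithmetic:

```python
def snip(rip, r, m):
    """SNIP = RIP / (R / M): raw impact per paper, normalized by the
    field's citation potential relative to the database median."""
    return rip / (r / m)

# Hypothetical inputs: RIP = 3.0 citations per paper, field citation
# potential R = 4.0, median database citation potential M = 2.0.
print(snip(3.0, 4.0, 2.0))  # -> 1.5
```

The denominator R/M rescales raw impact so that journals in fields with dense citation practices are not automatically favored over those in sparsely citing fields.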
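The recursive citation weighting behind Eigenfactor- and PageRank-style measures can be sketched as a power iteration over a journal cross-citation matrix. The three-journal matrix below is hypothetical, and this is a simplified model of the idea, not any provider's actual algorithm:

```python
import numpy as np

# Hypothetical cross-citation counts: C[i][j] = citations from journal j
# to journal i (diagonal is zero, i.e. self-citations are excluded).
C = np.array([[0, 4, 2],
              [3, 0, 6],
              [5, 1, 0]], dtype=float)

# Normalize columns so each citing journal distributes one unit of influence.
P = C / C.sum(axis=0)

# Power iteration: a journal's weight is the weighted sum of the weights
# of the journals that cite it, so citations from highly weighted
# journals count for more.
w = np.ones(3) / 3
for _ in range(100):
    w = P @ w

print(np.round(w, 3))
```

Real implementations refine this basic scheme, e.g. with a damping factor and per-article normalization, but the fixed point of this iteration is the same kind of recursive influence score the Pinski–Narin and PageRank approaches compute.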

Critical reflection

Negative consequences of rankings are generally well documented and relate to the performativity of using journal rankings for performance measurement.[21][22] For example, McKinnon (2017) has analyzed how the ABS-AJG ranking, which despite its methodological shortcomings is widely accepted in British business schools, has had negative consequences for the transportation and logistics management disciplines.[23] Universities are increasingly abandoning the idea that research quality can be measured on the uni-dimensional scale of a journal ranking. This has, for example, led to the San Francisco Declaration on Research Assessment (DORA), now signed by thousands of researchers worldwide, which asks “not [to] use journal-based metrics […] as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions”.[24] The Community for Responsible Research in Business Management (cRRBM) asks whether “even the academy is being served when faculty members are valued for the quantity and placement of their articles, not for the benefit their research can have for the world”.[25]

National rankings

Several national and international rankings of journals exist. They have been introduced as official research evaluation tools in several countries.[35] Examples include:

  • DHET List of Approved South African Journals
  • International: Scimago[36]


References

  1. Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; and Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars' journal basket via bibliometric measures: Do expert journal assessments add value?," MIS Quarterly (MISQ), vol. 37(4), 993–1012. Also, see YouTube video narrative of this paper at: https://www.youtube.com/watch?v=LZQIDkA-ke0.
  2. Lowry, Paul Benjamin; Romans, Denton; and Curtis, Aaron (2004). "Global journal prestige and supporting disciplines: A scientometric study of information systems journals," Journal of the Association for Information Systems (JAIS), vol. 5(2), pp. 29–80 (http://aisel.aisnet.org/amcis2005/276)
  3. Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; and Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars' journal basket via bibliometric measures: Do expert journal assessments add value?," MIS Quarterly (MISQ), vol. 37(4), 993–1012. Also, see YouTube video narrative of this paper at: https://www.youtube.com/watch?v=LZQIDkA-ke0.
  4. Serenko A., Dohan M. , "Comparing the expert survey and citation impact journal ranking methods: Example from the field of Artificial Intelligence", Journal of Informetrics, 5(4), 629-648, 2011
  5. Holsapple, C.W., " A Publication Power Approach for identifying premier information systems journals", Journal of the American Society for Information Science and Technology, 59(2), 166-185, 2008
  6. Serenko, A., Jiao, C., "Investigating information systems research in Canada", Canadian Journal of Administrative Sciences, 29(1), 3-24, 2012
  7. Alhoori, Hamed; Furuta, Richard (2013). Can Social Reference Management Systems Predict a Ranking of Scholarly Venues?. Research and Advanced Technology for Digital Libraries. Lecture Notes in Computer Science. 8092. pp. 138–143. CiteSeerX 10.1.1.648.3770. doi:10.1007/978-3-642-40501-3_14. ISBN 978-3-642-40500-6.
  8. Cornillier, F., Charles, V., "Measuring the attractiveness of academic journals: A direct influence aggregation model", Operations Research Letters, 43(2), 172–176, 2015
  9. "Elsevier Announces Enhanced Journal Metrics SNIP and SJR Now Available in Scopus". Press release. Elsevier. Retrieved 2014-07-27.
  10. Moed, Henk (2010). "Measuring contextual citation impact of scientific journals". Journal of Informetrics. 4 (3): 256–277. arXiv:0911.2632. doi:10.1016/j.joi.2010.01.002.
  11. Gabriel Pinski; Francis Narin (1976). "Citation influence for journal aggregates of scientific publications: Theory with application to literature of physics". Information Processing & Management. 12 (5): 297–312. doi:10.1016/0306-4573(76)90048-0.
  12. S. J. Liebowitz; J. P. Palmer. (1984). "Assessing the relative impacts of economics journals" (PDF). Journal of Economic Literature. 22 (1): 77–88. JSTOR 2725228.
  13. I. Palacios-Huerta; O. Volij (2004). "The Measurement of Intellectual Influence". Econometrica. 72 (3): 963–977. CiteSeerX 10.1.1.165.6602. doi:10.1111/j.1468-0262.2004.00519.x.
  14. Y. K. Kodrzycki; P. D. Yu (2006). "New Approaches to Ranking Economics Journals". Contributions to Economic Analysis & Policy. 5 (1). CiteSeerX 10.1.1.178.7834. doi:10.2202/1538-0645.1520.
  15. Bollen, Johan; Rodriguez, Marko A.; Van de Sompel, Herbert (December 2006). "Journal Status". Scientometrics. 69: 669–687. arXiv:cs.GL/0601030. Bibcode:2006cs........1030B. doi:10.1145/1255175.1255273. ISBN 9781595936448.
  16. C. T. Bergstrom. (May 2007). "Eigenfactor: Measuring the value and prestige of scholarly journals". College & Research Libraries News. 68 (5). Archived from the original on 2010-12-09.
  17. Jevin D. West. "eigenfactor.org". eigenfactor.org. Retrieved 2014-05-18.
  18. List of ISI and Scopus Indexed Journals (2015)
  19. Minasny, Budiman; Hartemink, Alfred E.; McBratney, Alex; Jang, Ho-Jun (2013-10-22). "Citations and the h-index of soil researchers and journals in the Web of Science, Scopus, and Google Scholar". PeerJ. 1: e183. doi:10.7717/peerj.183. ISSN 2167-8359. PMC 3807595. PMID 24167778.
  20. Denise Wolfe (2020-04-07). "SUNY Negotiates New, Modified Agreement with Elsevier - Libraries News Center University at Buffalo Libraries". library.buffalo.edu. University at Buffalo. Retrieved 2020-04-18.
  21. Espeland & Sauder, 2007, https://doi.org/10.1086/517897
  22. Grant & Kovács, 2018, https://doi.org/10.1108/EBR-12-2016-0155
  23. McKinnon, A.C. (2017). Starry-eyed II: The Logistics Journal Ranking Debate Revisited. International Journal of Physical Distribution & Logistics Management, 47 (6). DOI: 10.1108/IJPDLM-02-2017-0097
  24. https://sfdora.org/
  25. https://bized.aacsb.edu/articles/2018/05/the-moral-dilemma-to-business-research
  26. Australian Research Council ranking of journals worldwide Archived 2011-06-12 at the Wayback Machine
  27. Danish Ministry of Higher Education and Science (2014)
  28. Publication Forum
  29. "Publiseringskanaler - NSD - Norsk senter for forskningsdata". Retrieved 10 December 2016.
  30. ANVUR Riviste di classe A
  31. "Academic Journal Guide 2015 - Chartered Association of Business Schools". Retrieved 10 December 2016.
  32. "List of HEC Recognized Journals". Retrieved 10 December 2016.
  33. NAAS Journal Scoring
  34. "Polish Ministry of Higher Education and Science (2019)". www.bip.nauka.gov.pl. Retrieved 2019-10-12.
  35. Pontille D., Torny D. , "The controversial policies of journal ratings: evaluating social sciences and humanities", Research Evaluation, 19(5), 347-360, 2010
  36. Journal & Country Rank