Citation impact

Citation impact quantifies the citation usage of scholarly works.[1][2][3][4][5] It is a result of citation analysis or bibliometrics. Among the measures that have emerged from citation analysis are the citation counts for an individual article, an author, and an academic journal.

Article-level

One of the most basic citation metrics is how often an article is cited in other articles, books, or other sources (such as theses). Citation rates depend heavily on the discipline and the number of people working in that area. For instance, many more scientists work in neuroscience than in mathematics, and neuroscientists publish more papers than mathematicians; hence neuroscience papers are cited much more often than papers in mathematics.[6][7] Similarly, review papers are cited more often than regular research papers because they summarize results from many papers. This may also be why papers with shorter titles get more citations, given that they usually cover a broader area.[8]

Most-cited papers

The most-cited paper of all time is a paper by Oliver Lowry describing an assay to measure the concentration of proteins.[9] By 2014 it had accumulated more than 305,000 citations. The 10 most-cited papers all had more than 40,000 citations.[10] Reaching the top 100 papers required 12,119 citations by 2014.[10] Of the more than 58 million items in Thomson Reuters' Web of Science database, only 14,499 papers (~0.026%) had more than 1,000 citations as of 2014.[10]

Journal-level

Figure: Journal impact factors are heavily influenced by a small number of highly cited papers. In general, most papers published in 2013–14 received many fewer citations than indicated by the impact factor. Two journals, Nature (blue) and PLOS One (orange), represent a highly cited and a less cited journal, respectively. Note that the high citation impact of Nature derives from relatively few highly cited papers. Modified after Callaway 2016.[11]

Journal impact factors (JIFs) measure the average number of citations that articles a journal published in the previous two years received in the current year. However, very high impact factors are often driven by a small number of very highly cited papers. For instance, most papers in Nature (impact factor 38.1, 2016) were "only" cited 10 or 20 times during the reference year (see figure). Journals with a "low" impact factor (e.g. PLOS One, impact factor 3.1) publish many papers that are cited 0 to 5 times and few highly cited articles.[11]
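The two-year JIF described above reduces to a single ratio. The sketch below illustrates the arithmetic; the numeric figures in the example are hypothetical and chosen only to reproduce a plausible value, not taken from any journal's actual data.

```python
def impact_factor(citations_this_year, items_prev_two_years):
    """Two-year journal impact factor: citations received in the current
    year to items the journal published in the previous two years,
    divided by the number of citable items published in those two years."""
    return citations_this_year / items_prev_two_years

# Hypothetical figures for illustration only:
print(impact_factor(citations_this_year=7620, items_prev_two_years=200))  # 38.1
```

Because the numerator is dominated by a handful of highly cited papers, this average says little about the citation count of a typical article, which is the skew discussed above.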

JIFs are often misinterpreted as a measure of journal quality, or even of article quality. The JIF is a journal-level metric, not an article-level metric, so using it to determine the impact of a single article is statistically invalid. Journal citation distributions are skewed because a very small number of articles drives the vast majority of citations (see figure). For this reason, some journals, such as those of the American Society for Microbiology, have stopped publicizing their impact factors.[12]

Author-level

Total citations, or average citation count per article, can be reported for an individual author or researcher. Many other measures have been proposed, beyond simple citation counts, to better quantify an individual scholar's citation impact.[13] The best-known measures include the h-index[14] and the g-index.[15] Each measure has advantages and disadvantages,[16] spanning from bias to discipline-dependence and limitations of the citation data source.[17] Counting the number of citations per paper is also employed to identify the authors of citation classics.[18]
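The two best-known author-level measures can be stated compactly: the h-index[14] is the largest h such that the author has h papers with at least h citations each, and the g-index[15] is the largest g such that the author's g most-cited papers together have at least g² citations. A minimal sketch of both, with an invented citation list for illustration:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have
    at least h citations each (Hirsch 2005)."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """g-index: the largest g such that the g most-cited papers
    together have at least g**2 citations (Egghe 2006)."""
    total, g = 0, 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

papers = [25, 8, 5, 3, 3, 1, 0]  # hypothetical citation counts
print(h_index(papers))  # 3
print(g_index(papers))  # 6
```

The example shows why the g-index was proposed: the single 25-citation paper lifts g well above h, whereas the h-index ignores how far a paper exceeds the threshold.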

Alternatives

An alternative approach to measuring a scholar's impact relies on usage data, such as the number of downloads from publishers, often analyzed at the article level.[19][20][21][22]

As early as 2004, the BMJ published view counts for its articles, which were found to be somewhat correlated with citations.[23] In 2008 the Journal of Medical Internet Research began publishing views and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor.[24]
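The Twimpact factor as defined above is simply a count within a fixed time window. A minimal sketch, with hypothetical publication and tweet timestamps:

```python
from datetime import datetime, timedelta

def twimpact_factor(publication_time, tweet_times):
    """Twimpact factor (Eysenbach 2011): number of tweets mentioning an
    article within the first seven days after publication."""
    window_end = publication_time + timedelta(days=7)
    return sum(publication_time <= t < window_end for t in tweet_times)

# Hypothetical data: three tweets inside the 7-day window, one after it.
pub = datetime(2024, 1, 1)
tweets = [pub + timedelta(days=d) for d in (0.5, 2, 6.9, 10)]
print(twimpact_factor(pub, tweets))  # 3
```

The Twindex then ranks an article's Twimpact factor as a percentile against other articles from the same venue or period.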

In response to growing concerns over the inappropriate use of journal impact factors in evaluating scientific outputs and scientists themselves, Université de Montréal, Imperial College London, PLOS, eLife, EMBO Journal, The Royal Society, Nature and Science proposed citation distribution metrics as an alternative to impact factors.[25][26][27]

Open Access publications

Open access (OA) publications are available to readers at no cost, and it has therefore been argued that they should be cited more frequently.[28][29][30][31][32][33][34][35] While some experimental and observational studies have found that articles published in OA journals do not, on average, receive more citations than those published in subscription journals,[36][37] one more recent study found that OA journals receive significantly more citations overall than non-OA journals (median 15.5 vs. 12).[38]

Recent developments

An important recent development in research on citation impact is the discovery of universality, or citation impact patterns that hold across different disciplines in the sciences, social sciences, and humanities. For example, it has been shown that the number of citations received by a publication, once properly rescaled by its average across articles published in the same discipline and in the same year, follows a universal log-normal distribution that is the same in every discipline.[39] This finding suggested a universal citation impact measure that extends the h-index by rescaling citation counts and re-ranking publications accordingly; however, computing such a universal measure requires extensive citation data and statistics for every discipline and year. Social crowdsourcing tools such as Scholarometer have been proposed to address this need.[40][41] Kaur et al. proposed a statistical method to evaluate the universality of citation impact metrics, i.e., their capability to compare impact fairly across fields.[42] Their analysis identifies universal impact metrics, such as the field-normalized h-index.
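The rescaling step described above divides each paper's citation count by the average citation count of papers from the same discipline and year, making counts comparable across fields. A minimal sketch with invented numbers (note the field averages differ by an order of magnitude, echoing the neuroscience-versus-mathematics contrast earlier in the article):

```python
def rescaled_citations(papers):
    """Rescale each paper's citation count by the average count of its
    (discipline, year) group, in the spirit of Radicchi et al. (2008).
    `papers` is a list of (discipline, year, citations) tuples."""
    groups = {}
    for discipline, year, c in papers:
        groups.setdefault((discipline, year), []).append(c)
    averages = {key: sum(v) / len(v) for key, v in groups.items()}
    return [c / averages[(discipline, year)]
            for discipline, year, c in papers]

# Hypothetical data: a low-citation and a high-citation field.
papers = [("math", 2010, 5), ("math", 2010, 15),
          ("neuro", 2010, 50), ("neuro", 2010, 150)]
print(rescaled_citations(papers))  # [0.5, 1.5, 0.5, 1.5]
```

After rescaling, the mathematics and neuroscience papers fall on the same scale, which is what lets the rescaled counts collapse onto a single distribution across disciplines.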

Research suggests that the impact of an article can be partly explained by superficial factors, not only by its scientific merits.[43] Field-dependent factors are usually cited as an issue to be tackled not only when comparisons are made across disciplines, but also when different fields of research within one discipline are compared.[44] For instance, in medicine, the number of authors, the number of references, article length, and the presence of a colon in the title influence impact, among other factors; in sociology, the number of references, article length, and title length are among the factors.[45] Scholars have also been found to engage in ethically questionable behavior in order to inflate the number of citations their articles receive.[46]

Automated citation indexing[47] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. The first example of automated citation indexing was CiteSeer, later followed by Google Scholar. More recently, advanced models for a dynamic analysis of citation aging have been proposed.[48][49] The latter model has even been used as a predictive tool for estimating the citations that might be obtained at any time during the lifetime of a corpus of publications.

According to Mario Biagioli: "All metrics of scientific evaluation are bound to be abused. Goodhart's law [...] states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it."[50]

References

  1. Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science. 122 (3159): 108–111. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID 14385826.
  2. Garfield, E. (1973). "Citation Frequency as a Measure of Research Activity and Performance" (PDF). Essays of an Information Scientist. 1: 406–408.
  3. Garfield, E. (1988). "Can Researchers Bank on Citation Analysis?" (PDF). Essays of an Information Scientist. 11: 354.
  4. Garfield, E. (1998). "The use of journal impact factors and citation analysis in the evaluation of science". 41st Annual Meeting of the Council of Biology Editors.
  5. Moed, Henk F. (2005). Citation Analysis in Research Evaluation. Springer. ISBN 978-1-4020-3713-9.
  6. de Solla Price, D. J. (1963). Little Science, Big Science. Columbia University Press.
  7. Larsen, P. O.; von Ins, M. (2010). "The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index". Scientometrics. 84 (3): 575–603. doi:10.1007/s11192-010-0202-z. PMC 2909426. PMID 20700371.
  8. Deng, B. (26 August 2015). "Papers with shorter titles get more citations". Nature News. doi:10.1038/nature.2015.18246.
  9. Lowry, O. H.; Rosebrough, N. J.; Farr, A. L.; Randall, R. J. (1951). "Protein measurement with the Folin phenol reagent". The Journal of Biological Chemistry. 193 (1): 265–275. PMID 14907713.
  10. van Noorden, R.; Maher, B.; Nuzzo, R. (2014). "The top 100 papers". Nature. 514 (7524): 550–553. Bibcode:2014Natur.514..550V. doi:10.1038/514550a. PMID 25355343.
  11. Callaway, E. (2016). "Beat it, impact factor! Publishing elite turns against controversial metric". Nature. 535 (7611): 210–211. Bibcode:2016Natur.535..210C. doi:10.1038/nature.2016.20224. PMID 27411614.
  12. Casadevall, A.; Bertuzzi, S.; Buchmeier, M. J.; Davis, R. J.; Drake, H.; Fang, F. C.; Gilbert, J.; Goldman, B. M.; Imperiale, M. J. (2016). "ASM Journals Eliminate Impact Factor Information from Journal Websites". mSphere. 1 (4): e00184–16. doi:10.1128/mSphere.00184-16. PMC 4941020. PMID 27408939.
  13. Belikov, A. V.; Belikov, V. V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi:10.12688/f1000research.7070.1. PMC 4654436.
  14. Hirsch, J. E. (2005). "An index to quantify an individual's scientific research output". PNAS. 102 (46): 16569–16572. arXiv:physics/0508025. Bibcode:2005PNAS..10216569H. doi:10.1073/pnas.0507655102. PMC 1283832. PMID 16275915.
  15. Egghe, L. (2006). "Theory and practise of the g-index". Scientometrics. 69 (1): 131–152. doi:10.1007/s11192-006-0144-7. hdl:1942/981.
  16. Gálvez RH (March 2017). "Assessing author self-citation as a mechanism of relevant knowledge diffusion". Scientometrics. 111 (3): 1801–1812. doi:10.1007/s11192-017-2330-1.
  17. Couto, F. M.; Pesquita, C.; Grego, T.; Veríssimo, P. (2009). "Handling self-citations using Google Scholar". Cybermetrics. 13 (1): 2. Archived from the original on 2010-06-24. Retrieved 2009-05-27.
  18. Serenko, A.; Dumay, J. (2015). "Citation classics published in knowledge management journals. Part I: Articles and their characteristics" (PDF). Journal of Knowledge Management. 19 (2): 401–431. doi:10.1108/JKM-06-2014-0220.
  19. Bollen, J.; Van de Sompel, H.; Smith, J.; Luce, R. (2005). "Toward alternative metrics of journal impact: A comparison of download and citation data". Information Processing and Management. 41 (6): 1419–1440. arXiv:cs.DL/0503007. Bibcode:2005IPM....41.1419B. doi:10.1016/j.ipm.2005.03.024.
  20. Brody, T.; Harnad, S.; Carr, L. (2005). "Earlier Web Usage Statistics as Predictors of Later Citation Impact". Journal of the Association for Information Science and Technology. 57 (8): 1060. arXiv:cs/0503020. Bibcode:2005cs........3020B. doi:10.1002/asi.20373.
  21. Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C.; Demleitner, M.; Murray, S. S. (2004). "The Effect of Use and Access on Citations". Information Processing and Management. 41 (6): 1395–1402. arXiv:cs/0503029. Bibcode:2005IPM....41.1395K. doi:10.1016/j.ipm.2005.03.010.
  22. Moed, H. F. (2005b). "Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal". Journal of the American Society for Information Science and Technology. 56 (10): 1088–1097. doi:10.1002/asi.20200.
  23. Perneger, T. V. (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ. 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC 516105. PMID 15345629.
  24. Eysenbach, G. (2011). "Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact". Journal of Medical Internet Research. 13 (4): e123. doi:10.2196/jmir.2012. PMC 3278109. PMID 22173204.
  25. Veronique Kiermer (2016). "Measuring Up: Impact Factors Do Not Reflect Article Citation Rates". The Official PLOS Blog.
  26. "Ditching Impact Factors for Deeper Data". The Scientist. Retrieved 2016-07-29.
  27. "Scientific publishing observers and practitioners blast the JIF and call for improved metrics". Physics Today. 2016. doi:10.1063/PT.5.8183.
  28. Bibliography of Findings on the Open Access Impact Advantage
  29. Brody, T.; Harnad, S. (2004). "Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals". D-Lib Magazine. 10: 6.
  30. Eysenbach, G.; Tenopir, C. (2006). "Citation Advantage of Open Access Articles". PLOS Biology. 4 (5): e157. doi:10.1371/journal.pbio.0040157. PMC 1459247. PMID 16683865.
  31. Eysenbach, G. (2006). "The Open Access Advantage". Journal of Medical Internet Research. 8 (2): e8. doi:10.2196/jmir.8.2.e8. PMC 1550699. PMID 16867971.
  32. Hajjem, C.; Harnad, S.; Gingras, Y. (2005). "Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact" (PDF). IEEE Data Engineering Bulletin. 28 (4): 39–47. arXiv:cs/0606079. Bibcode:2006cs........6079H.
  33. Lawrence, S. (2001). "Free online availability substantially increases a paper's impact". Nature. 411 (6837): 521. Bibcode:2001Natur.411..521L. doi:10.1038/35079151. PMID 11385534.
  34. MacCallum, C. J.; Parthasarathy, H. (2006). "Open Access Increases Citation Rate". PLOS Biology. 4 (5): e176. doi:10.1371/journal.pbio.0040176. PMC 1459260. PMID 16683866.
  35. Gargouri, Y.; Hajjem, C.; Lariviere, V.; Gingras, Y.; Brody, T.; Carr, L.; Harnad, S. (2010). "Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research". PLOS One. 5 (10): e13636. arXiv:1001.0361. Bibcode:2010PLoSO...513636G. doi:10.1371/journal.pone.0013636. PMC 2956678. PMID 20976155.
  36. Davis, P. M.; Lewenstein, B. V.; Simon, D. H.; Booth, J. G.; Connolly, M. J. L. (2008). "Open access publishing, article downloads, and citations: randomised controlled trial". BMJ. 337: a568. doi:10.1136/bmj.a568. PMC 2492576. PMID 18669565.
  37. Davis, P. M. (2011). "Open access, readership, citations: a randomized controlled trial of scientific journal publishing". The FASEB Journal. 25 (7): 2129–2134. doi:10.1096/fj.11-183988. PMID 21450907.
  38. Chua, SK; Qureshi, Ahmad M; Krishnan, Vijay; Pai, Dinker R; Kamal, Laila B; Gunasegaran, Sharmilla; Afzal, MZ; Ambawatta, Lahiru; Gan, JY (2017-03-02). "The impact factor of an open access journal does not contribute to an article's citations". F1000Research. 6: 208. doi:10.12688/f1000research.10892.1. PMC 5464220. PMID 28649365.
  39. Radicchi, F.; Fortunato, S.; Castellano, C. (2008). "Universality of citation distributions: Toward an objective measure of scientific impact". PNAS. 105 (45): 17268–17272. arXiv:0806.0974. Bibcode:2008PNAS..10517268R. doi:10.1073/pnas.0806977105. PMC 2582263. PMID 18978030.
  40. Hoang, D.; Kaur, J.; Menczer, F. (2010). "Crowdsourcing Scholarly Data" (PDF). Proceedings of the WebSci10: Extending the Frontiers of Society On-Line.
  41. Kaur, J.; Hoang, D.; Sun, X.; Possamai, L.; JafariAsbagh, M.; Patil, S.; Menczer, F. (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS One. 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi:10.1371/journal.pone.0043235. PMC 3440403. PMID 22984414.
  42. Kaur, J.; Radicchi, F.; Menczer, F. (2013). "Universality of scholarly impact metrics". Journal of Informetrics. 7 (4): 924–932. arXiv:1305.6339. doi:10.1016/j.joi.2013.09.002.
  43. Bornmann, L.; Daniel, H. D. (2008). "What do citation counts measure? A review of studies on citing behavior". Journal of Documentation. 64 (1): 45–80. doi:10.1108/00220410810844150. hdl:11858/00-001M-0000-0013-7A94-3.
  44. Anauati, M. V.; Galiani, S.; Gálvez, R. H. (2014). "Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research". SSRN 2523078.
  45. van Wesel, M.; Wyatt, S.; ten Haaf, J. (2014). "What a difference a colon makes: how superficial factors influence subsequent citation" (PDF). Scientometrics. 98 (3): 1601–1615. doi:10.1007/s11192-013-1154-x. hdl:20.500.11755/2fd7fc12-1766-4ddd-8f19-1d2603d2e11d.
  46. van Wesel, M. (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics. 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC 4750571. PMID 25742806.
  47. Giles, C. L.; Bollacker, K.; Lawrence, S. (1998). "CiteSeer: An Automatic Citation Indexing System". DL'98 Digital Libraries, 3rd ACM Conference on Digital Libraries. pp. 89–98. doi:10.1145/276675.276685.
  48. Yu, G.; Li, Y.-J. (2010). "Identification of referencing and citation processes of scientific journals based on the citation distribution model". Scientometrics. 82 (2): 249–261. doi:10.1007/s11192-009-0085-z.
  49. Bouabid, H. (2011). "Revisiting citation aging: A model for citation distribution and life-cycle prediction". Scientometrics. 88 (1): 199–211. doi:10.1007/s11192-011-0370-5.
  50. Biagioli, M. (2016). "Watch out for cheats in citation game". Nature. 535 (7611): 201. Bibcode:2016Natur.535..201B. doi:10.1038/535201a. PMID 27411599.

This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.