Citation gaming induced by bibliometric evaluation: A country-level comparative analysis
Authors:
Alberto Baccini aff001; Giuseppe De Nicolao aff002; Eugenio Petrovich aff001
Affiliations:
aff001: Department of Economics and Statistics, University of Siena, Siena, Italy
aff002: Department of Electrical, Computer and Biomedical Engineering, University of Pavia, Pavia, Italy
Published in:
PLoS ONE 14(9)
Category:
Research Article
doi:
https://doi.org/10.1371/journal.pone.0221212
Abstract
For several years, national research evaluation systems around the globe have been using quantitative indicators to measure the performance of researchers. Nevertheless, the effects of these systems on the behavior of the evaluated researchers are still largely unknown. To investigate this topic, we propose a new indicator, inwardness, that gauges the degree of scientific self-referentiality of a country. Inwardness is defined as the proportion of citations coming from the country over the total number of citations gathered by the country. A comparative analysis of the trends for the G10 countries in the years 2000–2016 reveals a marked increase in Italian inwardness. Italy became, both globally and for a large majority of research fields, the country with the highest inwardness and the lowest rate of international collaborations. The change in the Italian trend occurs in the years following the introduction in 2011 of national regulations in which key steps of professional careers are governed by bibliometric indicators. The most likely explanation of the peculiar Italian trend is a generalized strategic use of citations in the Italian scientific community, both in the form of strategic author self-citations and of citation clubs. We argue that the Italian case offers crucial insights into the constitutive effects of evaluation systems. As such, it could become a paradigmatic case in the debate about the use of indicators in science-policy contexts.
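To make the definition of inwardness concrete, below is a minimal Python sketch (not from the paper) of how it could be computed from a list of citing-country/cited-country pairs extracted from a citation database. The data layout, function name, and toy values are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of the inwardness indicator defined in the abstract.
# Assumption: each record is a (citing_country, cited_country) pair;
# the representation of citations as country pairs is illustrative.

def inwardness(citation_pairs, country):
    """Share of a country's incoming citations that originate from the
    same country: domestic citations / total citations received."""
    total = 0      # all citations received by `country`
    domestic = 0   # citations received by `country` from itself
    for citing_country, cited_country in citation_pairs:
        if cited_country == country:
            total += 1
            if citing_country == country:
                domestic += 1
    return domestic / total if total else float("nan")

# Toy example: Italy receives 3 citations, 2 of them domestic.
pairs = [("IT", "IT"), ("IT", "IT"), ("DE", "IT"), ("IT", "FR")]
print(inwardness(pairs, "IT"))  # 0.666...
```

In the paper's setting, the same ratio would be computed per country and per year (and per research field) to obtain the 2000–2016 trends that the comparative analysis describes.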
Keywords:
People and places – Population groupings – Ethnicities – European people – Italian people – Geographical locations – Europe – European Union – Italy – Research and analysis methods – Research assessment – Citation analysis – Bibliometrics – Research quality assessment – Biology and life sciences – Psychology – Behavior – Veterinary science – Veterinary medicine – Social sciences