1. Geuna A. The changing rationale for European university research funding. Journal of Economic Issues. 2001;35: 607–632.
2. Stephan P. The Endless Frontier: Reaping What Bush Sowed? In: Jaffe AB, Jones BF, editors. The Changing Frontier: Rethinking Science and Innovation Policy. Chicago: University of Chicago Press; 2015. pp. 321–370.
3. Hicks D. Performance-based university research funding systems. Research Policy. 2012;41: 251–261.
4. Dasgupta P, David PA. Toward a new economics of science. Research Policy. 1994;23: 487–521.
5. Slaughter S, Leslie L. Academic Capitalism: Politics, Policies, and the Entrepreneurial University. Baltimore: Johns Hopkins University Press; 1997.
6. Leydesdorff L, Wouters P, Bornmann L. Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report. Scientometrics. 2016;109: 2129–2150. doi: 10.1007/s11192-016-2150-8 27942086
7. Van Raan AF. Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics. 2005;62: 133–143.
8. Marginson S, Van der Wende M. To rank or to be ranked: The impact of global rankings in higher education. Journal of Studies in International Education. 2007;11: 306–329.
9. Shin JC, Toutkoushian RK, Teichler U. University rankings: Theoretical basis, methodology and impacts on global higher education. Springer Science & Business Media; 2011.
10. Taylor BJ, Cantwell B. Global competition, US research universities, and international doctoral education: Growth and consolidation of an organizational field. Research in Higher Education. 2015;56: 411–441.
11. Hazelkorn E. Rankings and the battle for world-class excellence: institutional strategies and policy choices. Higher Education Management and Policy. 2009;21(1).
12. Bonaccorsi A, Cicero T, Haddawy P, Hassan S. Explaining the transatlantic gap in research excellence. Scientometrics. 2017;110: 217–241.
13. Paradeise C, Thoenig J. Academic Institutions in Search of Quality: Local Orders and Global Standards. Organ Stud. 2013;34: 189–218.
14. Merton RK. The Matthew Effect in Science. The reward and communication systems of science are considered. Science. 1968;159(3810): 56–63.
15. Hicks D, Wouters P, Waltman L, De Rijcke S, Rafols I. Bibliometrics: the Leiden Manifesto for research metrics. Nature. 2015;520: 429–431. doi: 10.1038/520429a 25903611
16. Vernon MM, Balas EA, Momani S. Are university rankings useful to improve research? A systematic review. PLoS ONE. 2018;13: e0193762. doi: 10.1371/journal.pone.0193762 29513762
17. Katz JS. The self-similar science system. Research Policy. 1999;28: 501–517.
18. Nomaler Ö, Frenken K, Heimeriks G. On scaling of scientific knowledge production in US metropolitan areas. PLoS ONE. 2014;9: e110805. doi: 10.1371/journal.pone.0110805 25353686
19. van Raan AF. Universities scale like cities. PLoS ONE. 2013;8: e59384. doi: 10.1371/journal.pone.0059384 23544062
20. Waltman L, van Eck NJ. Field-normalized citation impact indicators and the choice of an appropriate counting method. Journal of Informetrics. 2015;9: 872–894.
21. Abramo G, D’Angelo CA. A farewell to the MNCS and like size-independent indicators. Journal of Informetrics. 2016;10: 646–651.
22. Brinkman PT, Leslie LL. Economies of Scale in Higher Education: Sixty Years of Research. Review of Higher Education. 1986;10: 1–28.
23. Daraio C, Bonaccorsi A, Simar L. Efficiency and economies of scale and specialization in European universities: A directional distance approach. Journal of Informetrics. 2015;9: 430–448.
24. San Francisco Declaration on Research Assessment (DORA). https://sfdora.org.
25. National Center for Education Statistics. Integrated Postsecondary Education Data System (IPEDS). http://nces.ed.gov/ipeds/.
26. European Commission. European Tertiary Education Register (ETER). http://www.eter-project.com.
27. The Carnegie Foundation. The Carnegie Classification of Institutions of Higher Education. http://carnegieclassifications.iu.edu.
28. Waltman L, Calero‐Medina C, Kosten J, Noyons E, Tijssen RJ, Eck NJ, et al. The Leiden Ranking 2011/2012: Data collection, indicators, and interpretation. J Am Soc Inf Sci Technol. 2012;63: 2419–2432.
29. Waltman L, Calero‐Medina C, Kosten J, Noyons EC, Tijssen RJ, van Eck NJ, et al. The Leiden Ranking 2011/2012: Data collection, indicators, and interpretation. J Am Soc Inf Sci Technol. 2012;63: 2419–2432.
30. Leitao JC, Miotto JM, Gerlach M, Altmann EG. Is this scaling nonlinear? arXiv preprint arXiv:1604.02872. 2016.
31. Hansen CB. Generalized least squares inference in panel and multilevel models with serial correlation and fixed effects. Journal of Econometrics. 2007;140: 670–694.
32. Koenker R, Hallock KF. Quantile regression. Journal of Economic Perspectives. 2001;15: 143–156.
33. MacKinnon D. Introduction to statistical mediation analysis. New York: Taylor & Francis; 2007.
34. Archambault É, Campbell D, Gingras Y, Larivière V. Comparing bibliometric statistics obtained from the Web of Science and Scopus. J Am Soc Inf Sci Technol. 2009;60: 1320–1326.
35. Bettencourt LM. The origins of scaling in cities. Science. 2013;340: 1438–1441. doi: 10.1126/science.1235823 23788793
36. Geiger RL. Research and relevant knowledge: American research universities since World War II. Oxford: Oxford University Press; 1993.
37. Cohen AM. The shaping of American higher education: Emergence and growth of the contemporary system: John Wiley & Sons; 2007.
38. Daraio C, Bonaccorsi A, Geuna A, Lepori B, Bach L, Bogetoft P, et al. The European university landscape. Research Policy. 2011;40: 148–164. doi: 10.1016/j.respol.2010.10.009
39. Weerts DJ, Ronca JM. Understanding differences in state support for higher education across states, sectors, and institutions: A longitudinal study. The Journal of Higher Education. 2012;83: 155–185.
40. Peterson GJ, Presse S, Dill KA. Nonuniversal power law scaling in the probability distribution of scientific citations. Proc Natl Acad Sci U S A. 2010;107: 16023–16027. doi: 10.1073/pnas.1010757107 20805513
41. Hicks D. Performance-based university research funding systems. Research Policy. 2012;41: 251–261.
42. Slaughter S, Rhoades G. Academic Capitalism and the New Economy: Markets, State, and Higher Education. Baltimore: Johns Hopkins University Press; 2004.
43. Sauder M, Espeland WN. The discipline of rankings: tight coupling and organizational change. American Sociological Review. 2009;74: 63–82.
44. Deem R, Mok KH, Lucas L. Transforming higher education in whose image? Exploring the concept of the ‘world-class’ university in Europe and Asia. Higher Education Policy. 2008;21: 83–97.
45. Glänzel W, Thijs B, Debackere K. Productivity, performance, efficiency, impact – What do we measure anyway? Some comments on the paper “A farewell to the MNCS and like size-independent indicators” by Abramo and D’Angelo. Journal of Informetrics. 2016.
46. Bonaccorsi A. Explaining poor performance of European science: institutions versus policies. Science and Public Policy. 2007;34: 303–316.
47. Labaree DF. Public schools for private gain: The declining American commitment to serving the public good. Phi Delta Kappan. 2018;100: 8–13.
48. Hazelkorn E, Gibson A. The impact and influence of rankings on the quality, performance and accountability agenda. In: Research Handbook on Quality, Performance and Accountability in Higher Education. 2018. p. 232.
49. Gumport PJ. Academic Restructuring: Organizational Change and Institutional Imperatives. Higher Education. 2000;39: 67–91.
50. Lange KL, Little RJ, Taylor JM. Robust statistical modeling using the t distribution. Journal of the American Statistical Association. 1989;84: 881–896.
51. Bartolucci F, Scaccia L. The use of mixtures for dealing with non-normal regression errors. Comput Stat Data Anal. 2005;48: 821–834.