Trade-offs in motivating volunteer effort: Experimental evidence on voluntary contributions to science


Authors: Elizabeth Lyons aff001;  Laurina Zhang aff002
Authors' affiliations: School of Global Policy & Strategy, University of California, San Diego, La Jolla, CA, United States of America aff001;  Scheller College of Business, Georgia Institute of Technology, 800 W Peachtree St NW, Atlanta, GA, United States of America aff002
Published in: PLOS ONE 14(11), 2019
Category: Research Article
doi: 10.1371/journal.pone.0224946

Abstract

Digitization has facilitated the proliferation of crowd science by lowering the cost of finding individuals who are willing to participate in science without pay. However, the factors that influence participation and the outcomes of voluntary participation are unclear. We report two findings from a field experiment on the world’s largest crowd science platform that tests how voluntary contributions to science are affected by providing clarifying information on either the desired outcome of a scientific task or the labor requirements for completing the task. First, there is significant heterogeneity in the motivations and abilities of contributors to crowd science. Second, both information interventions lead to significant decreases in the quantity and increases in the quality of contributions. Combined, our findings are consistent with the information interventions improving match quality between the task and the volunteer. They suggest that small changes in the information provided to participants can help democratize science by engaging individuals with varying skill levels and motivations.

Keywords:

Astronomy – Attention – Grasses – Motivation – Research validity – Scientists – Shrubs – Surveys


References

1. Brown HR, Zeidman P, Smittenaar P, Adams RA, McNab F, Rutledge RB, et al. Crowdsourcing for cognitive science—The utility of smartphones. PLOS One. 2014;9.

2. Bagla P. Crowd-Sourcing Drug Discovery. Science. 2012;335(6071):909. doi: 10.1126/science.335.6071.909. PMID: 22362985

3. Sauermann H, Franzoni C. Crowd science user contribution patterns and their implications. PNAS. 2015;112(3):679–684. doi: 10.1073/pnas.1408907112

4. Andreoni J, Gale WG, Scholz JK. Charitable contributions of time and money. Working paper; 1996.

5. Cantoni D, Yang DY, Yuchtman N, Zhang YJ. Protests as Strategic Games: Experimental Evidence from Hong Kong’s Anti-Authoritarian Movement. The Quarterly Journal of Economics. 2019;134(2).

6. Besley T, Ghatak M. Competition and incentives with motivated agents. The American Economic Review. 2005;95(3):616–636. doi: 10.1257/0002828054201413

7. Shenhav A, Musslick S, Lieder F, Kool W, Griffiths TL, Cohen JD, et al. Toward a rational and mechanistic account of mental effort. Annual Review of Neuroscience. 2017;40:99–124. doi: 10.1146/annurev-neuro-072116-031526. PMID: 28375769

8. Levitt SD, List JA. Was there really a Hawthorne effect at the Hawthorne plant? An analysis of the original illumination experiments. American Economic Journal: Applied Economics. 2011;3(1):224–238.

9. Knowles S, Servátka M. Transaction costs, the opportunity cost of time and procrastination in charitable giving. Journal of Public Economics. 2015;125:54–63. doi: 10.1016/j.jpubeco.2015.03.001

10. Brown AL, Meer J, Williams JF. Why do people volunteer? An experimental analysis of preferences for time donations. Management Science. 2018.

11. Lyons E, Zhang L. Data from: Trade-offs in motivating volunteer effort: Experimental evidence on voluntary contributions to science. UC San Diego Library Digital Collections; 2019. https://doi.org/10.6075/J0N58JRK.

12. Exley CL. Using charity performance metrics as an excuse not to give. Working paper; 2018.

13. Cassar L, Meier S. Non-Monetary Incentives and the Quest for Work Meaning. Working paper; 2016.

14. Burbano VC. Getting Gig Workers to Do More by Doing Good: Field Experimental Evidence from Online Platform Labor Marketplaces. Working paper; 2017.

15. Hedblom D, Hickman B, List J. Toward an Understanding of Corporate Social Responsibility: Theory and Field Experimental Evidence. NBER Working Paper 26222; 2019.

16. Petrongolo B, Pissarides CA. Looking into the black box: A survey of the matching function. Journal of Economic Literature. 2001;39(2):390–431. doi: 10.1257/jel.39.2.390

17. Holmstrom B, Milgrom P. Multitask Principal-Agent Analyses: Incentive Contracts, Asset Ownership, and Job Design. Journal of Law, Economics & Organization. 1991;7:24–52. doi: 10.1093/jleo/7.special_issue.24

18. Kelling S, Fink D, La Sorte FA, Johnston A, Bruns NE, Hochachka WM. Taking a ‘Big Data’ approach to data quality in a citizen science project. Ambio. 2015;44(4):601–611. doi: 10.1007/s13280-015-0710-4. PMID: 26508347

19. Boudreau KJ, Lacetera N, Lakhani KR. Incentives and Problem Uncertainty in Innovation Contests: An Empirical Analysis. Management Science. 2011;57(5):843–863. doi: 10.1287/mnsc.1110.1322

20. Lindley D. Managing data. Communications of the ACM. 2009;52(10):11–13. doi: 10.1145/1562764.1562771

21. Marincola E. Why is public science education important? Journal of Translational Medicine. 2006;4(1):7. doi: 10.1186/1479-5876-4-7. PMID: 16433911


The article was published in

PLOS ONE

2019, Issue 11