Abstract
We conduct an experimental analysis of a dataset comprising over 27 million microtasks, issued to a large crowdsourcing marketplace between 2012 and 2016 and performed by over 70,000 workers. Using this data, never before analyzed in an academic context, we shed light on three crucial aspects of crowdsourcing: (1) Task design: helping requesters understand what constitutes an effective task and how to go about designing one; (2) Marketplace dynamics: helping marketplace administrators and designers understand the interaction between tasks and workers, and the corresponding marketplace load; and (3) Worker behavior: understanding worker attention spans, lifetimes, and general behavior, toward improving the crowdsourcing ecosystem as a whole.