
Understanding workers, developing effective tasks, and enhancing marketplace dynamics: a study of a large crowdsourcing marketplace

Published: 01 March 2017

Abstract

We conduct an experimental analysis of a dataset comprising over 27 million microtasks issued to a large crowdsourcing marketplace between 2012 and 2016 and performed by over 70,000 workers. Using this data, never before analyzed in an academic context, we shed light on three crucial aspects of crowdsourcing: (1) task design: helping requesters understand what constitutes an effective task and how to go about designing one; (2) marketplace dynamics: helping marketplace administrators and designers understand the interaction between tasks and workers, and the corresponding marketplace load; and (3) worker behavior: understanding worker attention spans, lifetimes, and general behavior, to improve the crowdsourcing ecosystem as a whole.
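The third aspect above centers on measures such as worker lifetime and attention span. As a minimal illustrative sketch, assuming a completion log of (worker_id, timestamp) records (a hypothetical schema, not the paper's), lifetime can be computed as the span between a worker's first and last task, and attention spans can be approximated by session lengths, splitting a worker's activity wherever two consecutive tasks are far apart in time:

    # Illustrative sketch only: the log schema and the 30-minute session gap
    # are assumptions for this example, not taken from the paper's dataset.
    from collections import defaultdict
    from datetime import timedelta

    def worker_lifetimes(log):
        # log: iterable of (worker_id, completed_at) pairs, completed_at a datetime.
        # Lifetime = time between a worker's first and last completed task.
        first, last = {}, {}
        for worker, ts in log:
            first[worker] = min(ts, first.get(worker, ts))
            last[worker] = max(ts, last.get(worker, ts))
        return {w: last[w] - first[w] for w in first}

    def session_lengths(log, gap=timedelta(minutes=30)):
        # Attention-span proxy: a new session starts whenever two consecutive
        # tasks by the same worker are more than `gap` apart.
        by_worker = defaultdict(list)
        for worker, ts in log:
            by_worker[worker].append(ts)
        sessions = defaultdict(list)
        for worker, stamps in by_worker.items():
            stamps.sort()
            start = prev = stamps[0]
            for ts in stamps[1:]:
                if ts - prev > gap:
                    sessions[worker].append(prev - start)
                    start = ts
                prev = ts
            sessions[worker].append(prev - start)
        return sessions

Any such gap threshold is a modeling choice and would have to be validated against the actual task log.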



  • Published in

    Proceedings of the VLDB Endowment, Volume 10, Issue 7
    March 2017
    132 pages
    ISSN: 2150-8097

    Publisher

    VLDB Endowment

    Qualifiers

    • research-article
