ABSTRACT
The phenomenal success of certain crowdsourced online platforms, such as Wikipedia, is attributed to their ability to tap the crowd's potential to collaboratively build knowledge. While it is well known that the crowd's collective wisdom surpasses the cumulative expertise of individuals, little is understood about the dynamics of knowledge building in a crowdsourced environment. A proper understanding of these dynamics would enable the better design of such environments for soliciting knowledge from the crowd. Our experiment on annotation-based crowdsourced systems shows that an important driver of the rapid knowledge building in such environments is the variance in expertise. As our test bed, we use a customized Crowdsourced Annotation System (CAS), which provides a group of users the facility to annotate a given document while trying to understand it. Our results show the presence of different genres of proficiency among the users of an annotation system. We observe that the crowdsourced knowledge ecosystem comprises mainly four categories of contributors, namely: Probers, Solvers, Articulators, and Explorers. We infer from our experiment that knowledge building happens mainly through the synergistic interaction across these categories.
- Presence of an Ecosystem: a Catalyst in the Knowledge Building Process in Crowdsourced Annotation Environments