
Evaluation with Respect to Usefulness

Some Perspectives from Industry

Chapter in Bridging Between Information Retrieval and Databases (PROMISE 2013)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8173)

Abstract

Commercial web search engines are used by millions of users across the globe every day to satisfy their information needs. A user enters a query in the search box and expects the search engine to return relevant results. Evaluating the quality of search results is therefore central to the development and maintenance of these systems. In this paper we describe some of the current approaches to assessing quality in an industrial setting.
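The paper addresses industrial practice rather than a single algorithm, but to make offline quality assessment concrete, here is a minimal sketch of NDCG, one standard graded-relevance metric that scores a ranked result list against human judgments. The judgment values, cutoff, and function names below are illustrative assumptions, not taken from the paper:

    import math

    def dcg(relevances, k):
        # Discounted cumulative gain over the top-k results.
        # relevances: graded judgments (e.g. 0-3) in ranked order.
        return sum((2 ** rel - 1) / math.log2(rank + 2)  # rank is 0-based
                   for rank, rel in enumerate(relevances[:k]))

    def ndcg(relevances, k=10):
        # Normalize by the DCG of the ideal (judgment-sorted) ranking.
        ideal = dcg(sorted(relevances, reverse=True), k)
        return dcg(relevances, k) / ideal if ideal > 0 else 0.0

    # Hypothetical judgments for one query's top results (3 = perfect, 0 = bad).
    judged = [3, 2, 3, 0, 1, 2]
    print(f"NDCG@5 = {ndcg(judged, k=5):.3f}")

In practice, metrics of this kind are computed over large sets of judged queries and tracked over time, and are complemented by online signals such as clicks and interleaving experiments.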






Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Alonso, O. (2014). Evaluation with Respect to Usefulness. In: Ferro, N. (ed.) Bridging Between Information Retrieval and Databases. PROMISE 2013. Lecture Notes in Computer Science, vol. 8173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-54798-0_8


  • DOI: https://doi.org/10.1007/978-3-642-54798-0_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-54797-3

  • Online ISBN: 978-3-642-54798-0

  • eBook Packages: Computer Science (R0)
