DOI: 10.1145/2348283.2348314

research-article

Personalization of search results using interaction behaviors in search sessions

Published: 12 August 2012

ABSTRACT

Personalization of search results offers the potential for significant improvement in information retrieval performance. User interactions with the system and with documents during information-seeking sessions provide a wealth of information about user preferences and task goals. In this paper, we propose methods for analyzing and modeling user search behavior in search sessions to predict document usefulness, and then use this information to personalize search results. We generate prediction models of document usefulness from behavior data collected in a controlled lab experiment with 32 participants, each conducting uncontrolled Web searches for 4 tasks. The generated models are then tested on another data set of user search sessions with radically different search tasks and constraints. The documents predicted useful and not useful by the models are used to modify the queries in each search session with a standard relevance feedback technique. The results show that applying the models leads to consistently improved performance over a baseline that does not take user interaction information into account. These findings have implications for designing personalized search systems and improving the user search experience.
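
The abstract describes two coupled steps: a model that predicts document usefulness from interaction behaviors logged during a session, and a standard relevance feedback step that uses the predicted-useful and not-useful documents to modify the session's queries. The sketch below illustrates that pipeline in Python; the behavior features (dwell time, revisits), the hand-set usefulness rule standing in for a trained classifier, and the Rocchio-style weights are illustrative assumptions, not the paper's actual features or parameters.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class PageVisit:
    doc_id: str
    dwell_time: float   # seconds the user spent reading the document
    revisits: int       # how many times the user returned to the document
    text: str           # document text, used to draw feedback terms

def predict_useful(visit: PageVisit) -> bool:
    """Stand-in for a learned usefulness model (e.g., a classifier trained on
    logged behavior features); the thresholds here are illustrative only."""
    return visit.dwell_time >= 30.0 or visit.revisits >= 2

def expand_query(query_terms, visits, k=5, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio-style feedback: boost terms from predicted-useful documents,
    down-weight terms from predicted-not-useful ones, add the top-k new terms."""
    weights = Counter({t: alpha for t in query_terms})
    useful = [v for v in visits if predict_useful(v)]
    not_useful = [v for v in visits if not predict_useful(v)]
    for docs, coeff in ((useful, beta), (not_useful, -gamma)):
        for v in docs:
            for term in v.text.lower().split():
                weights[term] += coeff / len(docs)
    new_terms = [t for t, w in weights.most_common()
                 if t not in query_terms and w > 0]
    return list(query_terms) + new_terms[:k]

# Example: one session with a long, revisited page and a quickly abandoned one.
visits = [
    PageVisit("d1", dwell_time=75.0, revisits=1,
              text="glacier retreat drives sea level rise measurements"),
    PageVisit("d2", dwell_time=4.0, revisits=0,
              text="cheap flights and hotel deals"),
]
print(expand_query(["sea", "level", "rise"], visits))
```

In this toy session, terms from the long-dwell, revisited page are added to the query, while terms from the quickly abandoned page are suppressed; in the paper the usefulness decision comes from models learned on logged behavior data rather than a hand-set rule.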


Published in

SIGIR '12: Proceedings of the 35th international ACM SIGIR conference on Research and development in information retrieval
August 2012
1236 pages
ISBN: 9781450314725
DOI: 10.1145/2348283

Copyright © 2012 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 12 August 2012


Qualifiers

• research-article

Acceptance Rates

Overall Acceptance Rate: 792 of 3,983 submissions, 20%
