
Improving MEDLINE Document Retrieval Using Automatic Query Expansion

  • Conference paper
Asian Digital Libraries. Looking Back 10 Years and Forging New Frontiers (ICADL 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4822)


Abstract

In this study, we performed a comprehensive evaluation of the pseudo-relevance feedback technique for automatic query expansion using the OHSUMED test collection. Well-known term sorting methods for selecting expansion terms were tested in our experiments. We also proposed a new term reweighting method for further performance improvements. Across multiple sets of tests, our results suggest that local context analysis is probably the most effective method for selecting good expansion terms from a set of MEDLINE documents, given enough feedback documents. Both the term sorting and term reweighting methods need to be carefully considered to achieve maximum performance improvements.
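The pseudo-relevance feedback loop evaluated in the paper can be sketched as: retrieve with the original query, assume the top-k results are relevant, score candidate terms from those documents, and append the best-scoring terms to the query. The toy corpus, tf-idf term-scoring function, and parameter values below are illustrative assumptions only; the paper itself used the OHSUMED collection and compared several term sorting methods, including local context analysis.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple tf-idf vectors for a list of tokenized documents."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))
    idf = {t: math.log(n / df[t]) for t in df}
    return [{t: c * idf[t] for t, c in Counter(d).items()} for d in docs], idf

def cosine(a, b):
    num = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return num / (na * nb) if na and nb else 0.0

def retrieve(query_terms, vecs, idf):
    """Rank documents by cosine similarity to the query vector."""
    qv = {t: idf.get(t, 0.0) for t in query_terms}
    return sorted(range(len(vecs)), key=lambda i: cosine(qv, vecs[i]), reverse=True)

def expand_query(query_terms, vecs, idf, k=2, m=2):
    """Pseudo-relevance feedback: treat the top-k retrieved documents as
    relevant, sort candidate terms by their summed tf-idf weight in those
    documents, and append the m best new terms to the query."""
    top = retrieve(query_terms, vecs, idf)[:k]
    scores = Counter()
    for i in top:
        for t, w in vecs[i].items():
            if t not in query_terms:
                scores[t] += w
    return list(query_terms) + [t for t, _ in scores.most_common(m)]

# Tiny illustrative corpus (not OHSUMED).
docs = [
    "heart attack myocardial infarction treatment".split(),
    "myocardial infarction aspirin therapy".split(),
    "influenza vaccine immune response".split(),
]
vecs, idf = tfidf_vectors(docs)
expanded = expand_query(["myocardial", "infarction"], vecs, idf)
```

The term reweighting question the paper raises enters where the expansion terms are appended: a full implementation would assign the new terms lower query weights than the original terms rather than treating all terms equally, as this sketch does.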




Editor information

Editors: Dion Hoe-Lian Goh, Tru Hoang Cao, Ingeborg Torvik Sølvberg, Edie Rasmussen


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yoo, S., Choi, J. (2007). Improving MEDLINE Document Retrieval Using Automatic Query Expansion. In: Goh, D.H.-L., Cao, T.H., Sølvberg, I.T., Rasmussen, E. (eds) Asian Digital Libraries. Looking Back 10 Years and Forging New Frontiers. ICADL 2007. Lecture Notes in Computer Science, vol 4822. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-77094-7_33


  • DOI: https://doi.org/10.1007/978-3-540-77094-7_33

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-77093-0

  • Online ISBN: 978-3-540-77094-7

  • eBook Packages: Computer Science (R0)
