
One-sided measures for evaluating ranked retrieval effectiveness with spontaneous conversational speech

Published: 6 August 2006

ABSTRACT

Early speech retrieval experiments focused on news broadcasts, for which adequate Automatic Speech Recognition (ASR) accuracy could be obtained. Like newspapers, news broadcasts are a manually selected and arranged set of stories. Evaluation designs reflected that structure, using known story boundaries as the unit of assessment. Substantial advances in ASR accuracy now make it possible to build search systems for some types of spontaneous conversational speech, but present evaluation designs continue to rely on known topic boundaries that are no longer well matched to the nature of the materials. We propose a new class of measures for speech retrieval based on manual annotation of points at which a user with specific topical interests would wish replay to begin.
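The replay-point idea can be sketched as a graded, generalized average precision: each system-proposed start point earns partial credit that decays with its distance from an annotated replay point, and the graded gains replace binary relevance in the average-precision sum. This is an illustrative sketch only; the linear decay, the 60-second window, and the greedy one-to-one matching of proposed points to annotated points are assumptions made here, not the paper's exact formulation.

```python
def start_point_score(system_start, annotated_start, window=60.0):
    """Partial credit for a proposed replay point, decaying linearly
    with the offset (in seconds) from an annotated replay point.
    The linear decay and 60-second window are illustrative assumptions."""
    offset = abs(system_start - annotated_start)
    return max(0.0, 1.0 - offset / window)

def generalized_average_precision(ranked_starts, annotated_starts, window=60.0):
    """GAP-style measure: average precision with binary relevance
    replaced by the graded start-point score above. Each annotated
    replay point may be credited at most once (greedy matching)."""
    remaining = list(annotated_starts)
    gains = []
    for start in ranked_starts:
        if not remaining:
            gains.append(0.0)
            continue
        # Greedily match the proposed point to its best-scoring annotation.
        best = max(remaining, key=lambda a: start_point_score(start, a, window))
        score = start_point_score(start, best, window)
        if score > 0:
            remaining.remove(best)
        gains.append(score)
    # Accumulate graded gain and average precision over ranks with credit.
    cumulative, avg_prec = 0.0, 0.0
    for rank, gain in enumerate(gains, 1):
        cumulative += gain
        if gain > 0:
            avg_prec += cumulative / rank
    return avg_prec / len(annotated_starts) if annotated_starts else 0.0
```

Under these assumptions, a system whose top-ranked result starts exactly at the annotated point scores 1.0, one that starts 30 seconds away scores 0.5, and one more than a minute away earns no credit.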


Published in

SIGIR '06: Proceedings of the 29th annual international ACM SIGIR conference on Research and development in information retrieval
August 2006
768 pages
ISBN: 1595933697
DOI: 10.1145/1148170

        Copyright © 2006 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 792 of 3,983 submissions, 20%
