In the past decade, research in Music Information Retrieval (MIR) has created a wealth of methods to extract latent musical information from the audio signal. While these methods are capable of inferring acoustic similarities between music pieces, revealing a song’s structure, or identifying a piece from a noisy recording, they cannot capture semantic information that is not encoded in the audio signal, but is nonetheless essential to many listeners. For instance, the meaning of a song’s lyrics, the background of a singer, or the work’s historical context cannot be derived without additional meta-data.

Such semantic information on music items can, however, be derived from other sources, including the web and social media, in particular services dedicated to the music domain. These sources typically offer a wide variety of multimedia data, including user-generated content, usage data, text, audio, video, and images. On the other hand, using these newly available sources of semantically meaningful information also poses new challenges: among them, dealing with the massive amounts of information involved and with the noisiness of such data, introduced, for example, by various user biases or by the injection of spurious information. This also calls for novel methods for user-centric evaluation of music retrieval systems.

Given the strengths and shortcomings inherent to both content-based and context-based approaches, hybrid methods that intelligently combine the two are essential. Such novel algorithms enable applications that capture musical aspects on a more comprehensive level than content-based or context-based approaches alone. By exploiting the full range of MIR technology, for instance, innovative user interfaces for accessing the large amounts of music available today (e.g., on tablets or smart mobile devices) and context-aware music recommendation systems become conceivable.

This Special Issue on “Hybrid Music Information Retrieval” highlights the newest developments in combining music content and context information. The five outstanding articles authored by excellent researchers in the field deal with multimodal music recommendation systems, investigation of relationships between visual and auditory signals, location-specific perception of music, multi-faceted cover song identification and retrieval, and reduction of human effort in evaluating music retrieval systems.

The submissions to this Special Issue were rigorously reviewed, and the guest editors eventually selected three considerably extended versions of papers presented at AdMIRe 2012, the 4th International Workshop on Advances in Music Information Research: “The Web of Music”, which was held in conjunction with the 21st International World Wide Web Conference in April 2012 in Lyon, France. From the other seven submissions, we selected two for inclusion in this Special Issue. All submissions underwent a review process involving two review cycles. We are hence delighted to present the following most recent research that undoubtedly defines the state of the art in hybrid MIR, a young and exciting research field:

  1. Marcos Aurélio Domingues, Fabien Gouyon, Alípio Mário Jorge, José Paulo Leal, João Vinagre, Luís Lemos, and Mohamed Sordo: Combining Usage and Content in an Online Recommendation System for Music in the Long-Tail [1]

  2. Cynthia C.S. Liem, Martha Larson, and Alan Hanjalic: When Music Makes a Scene—Characterizing Music in Multimedia Contexts via User Scene Descriptions [2]

  3. Matthias Braunhofer, Marius Kaminskas, and Francesco Ricci: Location-Aware Music Recommendation [3]

  4. Justin Salamon, Joan Serrà, and Emilia Gómez: Tonal Representations for Music Retrieval: From Version Identification to Query-by-Humming [4]

  5. Julián Urbano and Markus Schedl: Minimal Test Collections for Low-Cost Evaluation of Audio Music Similarity and Retrieval Systems [5]

The guest editors want to express their gratitude to, and acknowledge support from, the Austrian Science Fund (FWF): P22856-N23; the European Commission under FP7 (Seventh Framework Programme), ICT-2011.1.5 Networked Media and Search Systems, grant agreement no. 287711; and Gracenote.