Measuring performance of metasearch engines to access information: an exploratory study based on precision metrics

Raj Kumar Bhardwaj (Library, St Stephen’s College, University of Delhi, Delhi, India)
Ritesh Kumar (Department of Library and Information Science, Bharathidasan University, Tiruchirappalli, India)
Mohammad Nazim (Department of Library and Information Science, Aligarh Muslim University, Aligarh, India)

Performance Measurement and Metrics

ISSN: 1467-8047

Article publication date: 18 March 2024

Issue publication date: 17 April 2024

Abstract

Purpose

This paper evaluates the precision of four metasearch engines (MSEs): DuckDuckGo, Dogpile, Metacrawler and Startpage. It aims to determine which MSE exhibits the highest level of precision and is most likely to return the most relevant search results.

Design/methodology/approach

The research was conducted in two phases: the first used four queries categorized into two segments (4-Q-2-S), while the second used six queries divided into three segments (6-Q-3-S). The queries varied in complexity across three types: simple, phrase and complex. Precision, average precision and the presence of duplicates were determined for each of the evaluated metasearch engines.
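The metrics above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: it assumes binary relevance judgments for the retrieved results of each query, and takes "average precision" to mean the mean of per-query precision values for one engine (an assumption based on the description here, not the ranked-retrieval AP used in some IR literature).

```python
# Hedged sketch of the precision metrics described above.
# Relevance judgments below are hypothetical, not from the study.

def precision(relevant_flags):
    """Fraction of retrieved results judged relevant (1 = relevant, 0 = not)."""
    return sum(relevant_flags) / len(relevant_flags)

def average_precision(per_query_flags):
    """Mean of per-query precision values for one metasearch engine."""
    return sum(precision(q) for q in per_query_flags) / len(per_query_flags)

# Hypothetical top-10 judgments for two queries on one engine
queries = [
    [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],  # simple query: 9/10 relevant
    [1, 1, 1, 1, 0, 1, 1, 1, 1, 1],  # phrase query: 9/10 relevant
]
print(round(average_precision(queries), 2))  # → 0.9
```

Duplicate detection, also measured in the study, would additionally require comparing result URLs across engines; that step is omitted here.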

Findings

The study demonstrated that Startpage returned the most relevant results and achieved the highest precision (0.98) among the four MSEs, while DuckDuckGo exhibited the most consistent performance across both phases of the study.

Research limitations/implications

The study only evaluated four metasearch engines, which may not be representative of all available metasearch engines. Additionally, a limited number of queries were used, which may not be sufficient to generalize the findings to all types of queries.

Practical implications

The findings of this study can be valuable for accreditation agencies in managing duplicates, improving their search capabilities and obtaining more relevant and precise results. These findings can also assist users in selecting the best metasearch engine based on precision rather than interface.

Originality/value

The study is the first of its kind to evaluate these four metasearch engines; no similar study has previously measured their performance.

Citation

Bhardwaj, R.K., Kumar, R. and Nazim, M. (2024), "Measuring performance of metasearch engines to access information: an exploratory study based on precision metrics", Performance Measurement and Metrics, Vol. 25 No. 1, pp. 23-42. https://doi.org/10.1108/PMM-09-2023-0028

Publisher: Emerald Publishing Limited

Copyright © 2024, Emerald Publishing Limited
