ASPIRE: Assistive System for Performance Evaluation in IR

Peikos, G.; Kusa, W.; Symeonidis, S.
2025

Abstract

Information Retrieval (IR) evaluation involves far more complexity than merely presenting performance measures in a table. Researchers often need to compare multiple models across various dimensions, such as the Precision-Recall trade-off and response time, to understand the reasons behind the varying performance of specific queries for different models. We introduce ASPIRE (Assistive System for Performance Evaluation in IR), a visual analytics tool designed to address these complexities by providing an extensive and user-friendly interface for in-depth analysis of IR experiments. ASPIRE supports four key aspects of IR experiment evaluation and analysis: single/multi-experiment comparisons, query-level analysis, query characteristics-performance interplay, and collection-based retrieval analysis. We showcase the functionality of ASPIRE using the TREC Clinical Trials collection. ASPIRE is an open-source toolkit available online (https://github.com/GiorgosPeikos/ASPIRE).
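As a minimal, hypothetical illustration of the query-level analysis mentioned in the abstract (this is not code from the ASPIRE repository), the Python sketch below computes average precision per query for two runs and reports the per-query difference; the qrels, run rankings, and identifiers are invented placeholders.

# Hedged sketch: per-query comparison of two IR runs (illustrative data only).

def average_precision(ranked_docs, relevant):
    """Average precision of one ranked list against a set of relevant doc ids."""
    hits, score = 0, 0.0
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            score += hits / rank
    return score / len(relevant) if relevant else 0.0

# Toy qrels and two toy runs (placeholder query and document ids).
qrels = {"q1": {"d1", "d3"}, "q2": {"d2"}}
run_a = {"q1": ["d1", "d2", "d3"], "q2": ["d5", "d2"]}
run_b = {"q1": ["d2", "d1", "d3"], "q2": ["d2", "d5"]}

# Per-query deltas reveal where one model wins or loses, rather than a single mean score.
for qid in qrels:
    ap_a = average_precision(run_a[qid], qrels[qid])
    ap_b = average_precision(run_b[qid], qrels[qid])
    print(f"{qid}: AP(run_a)={ap_a:.3f}  AP(run_b)={ap_b:.3f}  delta={ap_a - ap_b:+.3f}")

This kind of per-query breakdown is what a visual analytics tool such as ASPIRE surfaces interactively instead of through ad-hoc scripts.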
Type: paper
Keywords: Interactive dashboard; IR evaluation; Visual analytics
Language: English
Conference: 47th European Conference on Information Retrieval, ECIR 2025 - April 6–10, 2025
Conference year: 2025
Published in: Advances in Information Retrieval 47th European Conference on Information Retrieval, ECIR 2025, Lucca, Italy, April 6–10, 2025, Proceedings, Part V
ISBN: 9783031887192
Publication year: 2025
Series volume: 15576 LNCS
Pages: 65-71
none
Peikos, G., Kusa, W., Symeonidis, S. (2025). ASPIRE: Assistive System for Performance Evaluation in IR. In Advances in Information Retrieval 47th European Conference on Information Retrieval, ECIR 2025, Lucca, Italy, April 6–10, 2025, Proceedings, Part V (pp. 65-71). Springer Science and Business Media Deutschland GmbH [10.1007/978-3-031-88720-8_12].
Files for this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/582681
Citations
  • Scopus: 0
  • Web of Science (ISI): ND