Braga, M., Kasela, P., Raganato, A., Pasi, G. (2025). Investigating Task Arithmetic for Zero-Shot Information Retrieval. In SIGIR '25: Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp.2738-2743). Association for Computing Machinery, Inc [10.1145/3726302.3730216].

Investigating Task Arithmetic for Zero-Shot Information Retrieval

Braga M.; Kasela P.; Raganato A.; Pasi G.
2025

Abstract

Large Language Models (LLMs) have shown impressive zero-shot performance across a variety of Natural Language Processing tasks, including document re-ranking. However, their effectiveness degrades on unseen tasks and domains, largely due to shifts in vocabulary and word distributions. In this paper, we investigate Task Arithmetic, a technique that combines the weights of LLMs pre-trained on different tasks or domains via simple mathematical operations, such as addition or subtraction, to adapt retrieval models without requiring additional fine-tuning. Our method is able to synthesize diverse tasks and domain knowledge into a single model, enabling effective zero-shot adaptation in different retrieval contexts. Extensive experiments on publicly available scientific, biomedical, and multilingual datasets show that our method improves state-of-the-art re-ranking performance by up to 18% in NDCG@10 and 15% in P@10. In addition to these empirical gains, our analysis provides insights into the strengths and limitations of Task Arithmetic as a practical strategy for zero-shot learning and model adaptation. We make our code publicly available at https://github.com/DetectiveMB/Task-Arithmetic-for-ZS-IR.
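The weight-space operation the abstract describes can be sketched as follows. This is a minimal toy illustration with NumPy arrays, not the authors' implementation (see the linked repository for the actual code): a task vector is the difference between fine-tuned and pre-trained weights, and adding (or subtracting) scaled task vectors to the pre-trained weights merges (or removes) task and domain knowledge without further training.

```python
import numpy as np

def task_vector(pretrained, finetuned):
    # Task vector: tau = theta_ft - theta_pre, computed per parameter tensor.
    return {k: finetuned[k] - pretrained[k] for k in pretrained}

def apply_task_arithmetic(pretrained, task_vectors, scale=1.0):
    # theta_new = theta_pre + scale * sum(tau_i); negating a tau subtracts
    # that task's knowledge instead of adding it.
    merged = {k: v.copy() for k, v in pretrained.items()}
    for tv in task_vectors:
        for k, v in tv.items():
            merged[k] += scale * v
    return merged

# Toy example with a single hypothetical parameter tensor "w".
theta_pre = {"w": np.array([1.0, 1.0])}
theta_task = {"w": np.array([1.5, 0.5])}  # fine-tuned on a task (e.g. ranking)
theta_lang = {"w": np.array([1.0, 2.0])}  # fine-tuned on a domain/language

tau_a = task_vector(theta_pre, theta_task)
tau_b = task_vector(theta_pre, theta_lang)
theta_new = apply_task_arithmetic(theta_pre, [tau_a, tau_b], scale=1.0)
print(theta_new["w"])  # combines both shifts: [1.5 1.5]
```

In practice the same element-wise arithmetic is applied to every tensor in a model's state dict, with the scaling coefficient tuned to balance the contributions of the different task vectors.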
Type: paper
Keywords: Domain-specific and Multilingual IR; Task Arithmetic; Zero-Shot
Language: English
Conference: SIGIR '25: The 48th International ACM SIGIR Conference on Research and Development in Information Retrieval, July 13-18, 2025
Proceedings: SIGIR '25: Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval
ISBN: 9798400715921
Year: 2025
Pages: 2738-2743
Access: open
Files in this product:

File: Braga-2025-SIGIR 25-VoR.pdf
Open access
Attachment type: Publisher's Version (Version of Record, VoR)
License: Creative Commons
Size: 1 MB
Format: Adobe PDF

File: Braga-2025-arXiv-AAM.pdf
Open access
Attachment type: Author's Accepted Manuscript, AAM (Post-print)
License: Creative Commons
Size: 598.87 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/571241
Citations
  • Scopus: 1