
Sokli, E., Peikos, G., Kasela, P., Pasi, G. (2025). Leveraging Cognitive Complexity of Texts for Contextualization in Dense Retrieval. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing (pp.27083-27096). Association for Computational Linguistics [10.18653/v1/2025.emnlp-main.1377].

Leveraging Cognitive Complexity of Texts for Contextualization in Dense Retrieval

Effrosyni Sokli; Georgios Peikos; Pranav Kasela; Gabriella Pasi
2025

Abstract

Dense Retrieval Models (DRMs) estimate the semantic similarity between queries and documents based on their embeddings. Prior studies highlight the importance of embedding contextualization in enhancing retrieval performance. To this end, existing approaches primarily leverage token-level information derived from query/document interactions. In this paper, we introduce a novel DRM, namely DenseC3, which leverages query/document interactions based on the full embedding representations generated by a Transformer-based model. To enhance similarity estimation, DenseC3 integrates external linguistic information about the Cognitive Complexity of texts, enriching the contextualization of embeddings. We empirically evaluate our approach across seven benchmarks and three different IR tasks to assess the impact of Cognitive Complexity-aware query and document embeddings for contextualization in dense retrieval. Results show that our approach consistently outperforms standard fine-tuning techniques on lightweight bi-encoders (e.g., BERT-based) and traditional late-interaction models (i.e., ColBERT) across all benchmarks. On larger retrieval-optimized bi-encoders like Contriever, our model achieves comparable or higher performance on four of the considered evaluation benchmarks. Our findings suggest that Cognitive Complexity-aware embeddings enhance query and document representations, improving retrieval effectiveness in DRMs. Our code is available online at: https://github.com/FaySokli/DenseC3.
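The abstract describes a bi-encoder whose relevance score fuses full query/document embedding interactions with an external Cognitive Complexity signal. A minimal, purely illustrative sketch of that idea follows — this is not the authors' implementation (see the linked repository for that): the toy hash-based embedding, the mean-word-length complexity proxy, and the fusion weight `alpha` are all assumptions introduced here for illustration.

```python
import math

def toy_complexity(text: str) -> float:
    # Hypothetical proxy for Cognitive Complexity: mean word length,
    # scaled to [0, 1]. The real system uses linguistic features.
    words = text.split()
    if not words:
        return 0.0
    mean_len = sum(len(w) for w in words) / len(words)
    return min(mean_len / 10.0, 1.0)

def toy_embed(text: str, dim: int = 8) -> list[float]:
    # Toy hash-bucket embedding standing in for a Transformer encoder's
    # full text representation; L2-normalized so dot product = cosine.
    vec = [0.0] * dim
    for tok in text.lower().split():
        vec[hash(tok) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def complexity_aware_score(query: str, doc: str, alpha: float = 0.9) -> float:
    # Fuse embedding-level similarity with a complexity-agreement term.
    # The linear fusion and the weight alpha are assumptions, not the
    # paper's architecture.
    q, d = toy_embed(query), toy_embed(doc)
    cos = sum(a * b for a, b in zip(q, d))
    complexity_match = 1.0 - abs(toy_complexity(query) - toy_complexity(doc))
    return alpha * cos + (1.0 - alpha) * complexity_match

if __name__ == "__main__":
    print(complexity_aware_score("dense retrieval models",
                                 "dense retrieval models estimate similarity"))
```

In this sketch a document is rewarded both for embedding similarity and for matching the query's estimated complexity, which conveys the intuition of complexity-aware contextualization without claiming fidelity to DenseC3's actual design.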
Type: paper
Keywords: Dense Information Retrieval
Language: English
Conference: Conference on Empirical Methods in Natural Language Processing, 2025
Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Published in: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
ISBN: 9798891763326
Year: 2025
Pages: 27083-27096
URL: https://aclanthology.org/2025.emnlp-main.1377/
Access: open
Files in this product:
Sokli-2025-Proceed 2025 Conf Empir Methods Natural Language Process-VoR.pdf

Open access

Attachment type: Publisher’s Version (Version of Record, VoR)
License: Creative Commons
Size: 8.08 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/588564