
A Fistful of Vectors: A Tool for Intrinsic Evaluation of Word Embeddings

Ascari, Roberto; Giabelli, Anna; Malandri, Lorenzo; Mercorio, Fabio; Mezzanzanica, Mario
2024

Abstract

Word embeddings—models, typically computed with neural network architectures, that encode words as vectors—are used in a rapidly growing range of Natural Language Processing applications, including semantic analysis, information retrieval, dependency parsing, question answering, and machine translation. The performance of these tasks is closely tied to the quality of the embeddings, which makes evaluating and selecting the best embedding model critically important. While established procedures and benchmarks exist for intrinsic evaluation, comprehensive evaluations of intrinsic embedding quality across multiple tasks are largely missing. This paper introduces vec2best, a unified tool that gathers state-of-the-art intrinsic evaluation tasks across diverse benchmarks and provides the user with an extensive evaluation of word embedding models. It is a framework for evaluating word embeddings trained with various methods and hyper-parameters on a range of tasks from the literature, and it yields a holistic evaluation metric for each model called the PCE (Principal Component Evaluation). We evaluated 135 word embedding models, trained with GloVe, fastText, and word2vec, on the four tasks integrated into vec2best (similarity, analogy, categorization, and outlier detection) and their respective benchmarks. We also used vec2best to optimize embedding hyper-parameter configurations in a real-world scenario. vec2best is available as a pip-installable Python package.
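
The abstract describes vec2best as producing a holistic PCE (Principal Component Evaluation) score from per-task intrinsic results and shipping as a pip-installable Python package (pip install vec2best). Purely as an illustration of the underlying idea, the sketch below shows how one might aggregate per-task scores into a single principal-component-based metric using standard scikit-learn; it is not the package's actual API, and all variable names and score values are hypothetical placeholders.

    # Minimal sketch (not the vec2best API): combine per-task intrinsic scores
    # into a single PCE-style metric via the first principal component.
    # All score values below are made-up placeholders.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Rows: embedding models; columns: intrinsic tasks
    # (similarity, analogy, categorization, outlier detection).
    scores = np.array([
        [0.61, 0.55, 0.72, 0.48],   # e.g. a GloVe configuration
        [0.66, 0.59, 0.75, 0.52],   # e.g. a fastText configuration
        [0.58, 0.51, 0.70, 0.45],   # e.g. a word2vec configuration
    ])

    # Standardize each task's scores, then project onto the first
    # principal component to obtain one holistic score per model.
    standardized = StandardScaler().fit_transform(scores)
    holistic = PCA(n_components=1).fit_transform(standardized).ravel()

    # The component's sign is arbitrary, so orient it to correlate
    # positively with the mean task score (higher = better).
    if np.corrcoef(holistic, scores.mean(axis=1))[0, 1] < 0:
        holistic = -holistic

    for i, h in enumerate(holistic):
        print(f"model {i}: PCE-style score = {h:.3f}")

In the paper's setting, the rows would instead be the 135 trained models and the columns their benchmark scores on the four tasks integrated into vec2best.
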
Journal article - Scientific article
Intrinsic evaluation; Natural language processing; Word embeddings
English
22 Jan 2024
2024
Ascari, R., Giabelli, A., Malandri, L., Mercorio, F., Mezzanzanica, M. (2024). A Fistful of Vectors: A Tool for Intrinsic Evaluation of Word Embeddings. COGNITIVE COMPUTATION [10.1007/s12559-023-10235-3].

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/457238
Citations
  • Scopus: 1
  • Web of Science: 1