
Rubini, R., Vimercati, M., Palmonari, M. (2024). PROMET: Parameter-Efficient Few-Shot Fine-Grained Entity Typing with Implicit Mask Filling. In 2024 IEEE/WIC International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT) (pp.141-149). Institute of Electrical and Electronics Engineers Inc. [10.1109/WI-IAT62293.2024.00027].

PROMET: Parameter-Efficient Few-Shot Fine-Grained Entity Typing with Implicit Mask Filling

Rubini R.; Vimercati M.; Palmonari M.
2024

Abstract

Few-shot Fine-grained Entity Typing (FET) techniques classify entities recognized in text into classes from a taxonomy using a limited amount of training data. The task supports entity extraction and knowledge graph (KG) construction by classifying novel entities according to a taxonomy of interest. In this paper, we present PROMET (PROmpt-tuning using implicit Mask filling for Entity Typing), a novel parameter-efficient prompt-based approach to few-shot FET that exploits implicit mask filling. Using the embeddings of the masked tokens avoids the need for a manually defined verbalizer to initialize the matrix that maps predicted tokens to entity classes. By fine-tuning only Adapters and a linear layer, instead of 1) the whole PLM and 2) the verbalizer matrix, PROMET uses two orders of magnitude fewer trainable parameters than existing models while achieving better or comparable performance on benchmark datasets. Finally, we extend PROMET (and modify the state-of-the-art few-shot FET approach) to work in multi-label inference settings, consistently with earlier work in the field and with typing patterns in KGs.
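The head the abstract describes — a trainable linear layer mapping the [MASK] token's embedding to one score per entity type, with multi-label (sigmoid-thresholded) inference — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the frozen PLM is faked with a random vector, Adapters are omitted, and all names and sizes are hypothetical.

```python
import math
import random

# Toy sizes; a real setup would use e.g. a 768-dim PLM embedding and
# hundreds of fine-grained entity types.
HIDDEN, NUM_TYPES = 8, 4

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def type_logits(mask_embedding, W, b):
    """Linear layer: one logit per entity type from the [MASK] embedding.

    W and b are the only trainable parameters in this sketch, standing in
    for the linear head the abstract mentions (verbalizer-free)."""
    return [sum(w_i * e for w_i, e in zip(row, mask_embedding)) + b_j
            for row, b_j in zip(W, b)]

def predict_types(mask_embedding, W, b, threshold=0.5):
    """Multi-label inference: keep every type whose sigmoid score
    exceeds the threshold, rather than a single argmax class."""
    return [j for j, z in enumerate(type_logits(mask_embedding, W, b))
            if sigmoid(z) > threshold]

random.seed(0)
# Stand-in for the frozen PLM's embedding of the masked token.
mask_embedding = [random.gauss(0, 1) for _ in range(HIDDEN)]
W = [[random.gauss(0, 0.1) for _ in range(HIDDEN)] for _ in range(NUM_TYPES)]
b = [0.0] * NUM_TYPES

predicted = predict_types(mask_embedding, W, b)
print(predicted)  # indices of the predicted entity types
```

In training, W and b (plus the Adapters inside the PLM) would be updated with a multi-label loss such as binary cross-entropy over the sigmoid scores, while the PLM backbone stays frozen.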
Type: paper
Keywords: artificial intelligence; fine-grained entity typing; green computing; knowledge base population; language models; named entity recognition; nlp; semantic web
Language: English
Conference: 2024 IEEE/WIC International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), 09-12 December 2024
Conference year: 2024
Published in: 2024 IEEE/WIC International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT)
ISBN: 9798331504946
Publication year: 2024
Pages: 141-149
Access: open
Files in this record:

File: Rubini-2024-IEEE_WIC-preprint.pdf
Access: open access
Attachment type: Submitted Version (Pre-print)
License: Creative Commons
Size: 1.44 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/562250
Citations
  • Scopus 0
  • Web of Science 0