de Varda, A., Marelli, M. (2024). Locally Biased Transformers Better Align with Human Reading Times. In CMCL 2024 - 13th Edition of the Workshop on Cognitive Modeling and Computational Linguistics, Proceedings of the Workshop (pp.30-36). Association for Computational Linguistics (ACL).

Locally Biased Transformers Better Align with Human Reading Times

de Varda, A. G.; Marelli, M.
2024

Abstract

Recent psycholinguistic theories emphasize the interdependence between linguistic expectations and memory limitations in human language processing. We modify the self-attention mechanism of a transformer model to simulate a lossy context representation, biasing the model’s predictions to give additional weight to the local linguistic context. We show that surprisal estimates from our locally-biased model generally provide a better fit to human psychometric data, underscoring the sensitivity of the human parser to local linguistic information.
Type: paper
Keywords: reading, llms
Language: English
Conference: 13th Edition of the Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2024), 15 August 2024
Year: 2024
Published in: CMCL 2024 - 13th Edition of the Workshop on Cognitive Modeling and Computational Linguistics, Proceedings of the Workshop
ISBN: 9798891761438
Pages: 30-36
DOI: none
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/546975
Citations
  • Scopus: 1