Multilingual NMT with a Language-Independent Attention Bridge

Vázquez, Raúl; Raganato, Alessandro; Tiedemann, Jörg; Creutz, Mathias
2019

Abstract

In this paper, we propose an architecture for machine translation (MT) capable of obtaining multilingual sentence representations by incorporating an intermediate attention bridge that is shared across all languages. We train the model with language-specific encoders and decoders that are connected through an inner-attention layer on the encoder side. The attention bridge exploits the semantics of each language for translation and develops into a language-agnostic meaning representation that can be used efficiently for transfer learning. We present a new framework for the efficient development of multilingual neural machine translation (NMT) using this model and scheduled training. We test the approach systematically on a multi-parallel data set. The model achieves substantial improvements over strong bilingual models and performs well in zero-shot translation, demonstrating its capacity for abstraction and transfer learning.
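The inner-attention layer described above can be made concrete with a short sketch. The following PyTorch module is a minimal, hypothetical re-implementation of such an attention bridge in the style of structured self-attention (Lin et al., 2017): it maps a variable-length sequence of encoder states to a fixed number of attended vectors that any language-specific decoder can consume. The class name, dimensions, and the num_heads default are illustrative assumptions, not the authors' reference code.

```python
import torch
import torch.nn as nn


class AttentionBridge(nn.Module):
    """Shared inner-attention layer (hypothetical sketch, not the paper's code).

    Compresses encoder states from any language into a fixed number of
    attention heads, yielding a constant-size, language-agnostic
    sentence representation.
    """

    def __init__(self, hidden_dim: int, bridge_dim: int = 512, num_heads: int = 10):
        super().__init__()
        self.w1 = nn.Linear(hidden_dim, bridge_dim, bias=False)  # projects encoder states
        self.w2 = nn.Linear(bridge_dim, num_heads, bias=False)   # one score row per head

    def forward(self, enc_states: torch.Tensor, mask: torch.Tensor = None) -> torch.Tensor:
        # enc_states: (batch, seq_len, hidden_dim) from a language-specific encoder
        # mask:       (batch, seq_len) booleans, True at real (non-padding) tokens
        scores = self.w2(torch.tanh(self.w1(enc_states)))        # (batch, seq_len, num_heads)
        if mask is not None:
            scores = scores.masked_fill(~mask.unsqueeze(-1), float("-inf"))
        attn = torch.softmax(scores, dim=1)                      # normalize over time steps
        # Weighted sums of encoder states: (batch, num_heads, hidden_dim)
        return attn.transpose(1, 2) @ enc_states


# Toy usage: two "sentences" of different lengths map to the same output shape.
bridge = AttentionBridge(hidden_dim=256)
states = torch.randn(2, 7, 256)
mask = torch.tensor([[1] * 7, [1] * 4 + [0] * 3], dtype=torch.bool)
print(bridge(states, mask).shape)  # torch.Size([2, 10, 256])
```

Because only a module of this kind is shared while the encoders and decoders stay language specific, every language pair communicates through the same fixed-size interface, which is what makes the zero-shot transfer described in the abstract possible.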
Type: paper
Keywords: attention bridge; machine translation; sentence representation
Language: English
Event: The 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), 2019
Published in: Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019)
ISBN: 978-1-950737-35-2
Year: 2019
Pages: 33-39
Rights: reserved
Citation: Vázquez, R., Raganato, A., Tiedemann, J., & Creutz, M. (2019). Multilingual NMT with a Language-Independent Attention Bridge. In Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) (pp. 33-39). DOI: 10.18653/v1/W19-4305.
Files in this record:
File: W19-4305.pdf (Adobe PDF, 372.83 kB). Access restricted to archive managers; a copy can be requested.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/361575
Citations
  • Scopus: 27
  • Web of Science (ISI): 9