Vázquez, R., Boggia, M., Raganato, A., Loppi, N., Grönroos, S., & Tiedemann, J. (2022). Latest Development in the FoTran Project - Scaling Up Language Coverage in Neural Machine Translation Using Distributed Training with Language-Specific Components. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation (EAMT 2022) (pp. 311–312).
Latest Development in the FoTran Project - Scaling Up Language Coverage in Neural Machine Translation Using Distributed Training with Language-Specific Components
Vázquez R.; Boggia M.; Raganato A.; Loppi N.; Grönroos S.; Tiedemann J.
2022
Abstract
We give an update on the Found in Translation (FoTran) project, focusing on the study of language-agnostic representations that emerge from neural machine translation (NMT). We describe our attention-bridge model, a modular NMT model that connects language-specific components through a shared network layer. Our latest implementation supports distributed training across many nodes and GPUs in order to substantially scale up the number of languages that can be included in a modern neural translation architecture.
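As a rough illustration of the shared-layer idea, the sketch below implements a fixed-size attention bridge in PyTorch, in the spirit of the structured self-attention used in earlier attention-bridge work: variable-length states from any language-specific encoder are compressed into a fixed number of attention heads that language-specific decoders can attend to. This is a minimal sketch under stated assumptions, not the project's actual implementation; all names and dimensions (`AttentionBridge`, `d_model`, `k_heads`, the toy batches at the end) are illustrative.

```python
import torch
import torch.nn as nn
from typing import Optional


class AttentionBridge(nn.Module):
    """Minimal fixed-size attention bridge (illustrative sketch).

    Compresses variable-length encoder states H (batch, seq_len, d_model)
    into k vectors via structured self-attention:
        A = softmax(W2 tanh(W1 H^T)),  B = A H
    so every language-specific encoder maps to the same (k, d_model) shape.
    """

    def __init__(self, d_model: int, d_hidden: int, k_heads: int):
        super().__init__()
        self.w1 = nn.Linear(d_model, d_hidden, bias=False)
        self.w2 = nn.Linear(d_hidden, k_heads, bias=False)

    def forward(self, enc_states: torch.Tensor,
                pad_mask: Optional[torch.Tensor] = None) -> torch.Tensor:
        # (batch, seq_len, k_heads): one attention distribution per head
        scores = self.w2(torch.tanh(self.w1(enc_states)))
        if pad_mask is not None:  # True where the token is padding
            scores = scores.masked_fill(pad_mask.unsqueeze(-1), float("-inf"))
        attn = torch.softmax(scores, dim=1)           # normalize over tokens
        return attn.transpose(1, 2) @ enc_states      # (batch, k_heads, d_model)


if __name__ == "__main__":
    bridge = AttentionBridge(d_model=512, d_hidden=1024, k_heads=10)
    h_en = torch.randn(8, 31, 512)  # states from an English-specific encoder
    h_fi = torch.randn(8, 47, 512)  # states from a Finnish-specific encoder
    # Different source lengths, identical bridge output shape: (8, 10, 512)
    print(bridge(h_en).shape, bridge(h_fi).shape)
```

Because decoders only ever see this fixed-size output, a shared interface of this kind is plausibly what lets the language-specific components in the distributed setup described above be placed on different nodes and GPUs while sharing only the bridge parameters.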
| File | Description | Type of attachment | Access | Size | Format |
|---|---|---|---|---|---|
| Vazquez-2022-EAMT-VoR.pdf | Conference paper | Publisher's Version (Version of Record, VoR) | Open access | 223.41 kB | Adobe PDF |