Latest Development in the FoTran Project - Scaling Up Language Coverage in Neural Machine Translation Using Distributed Training with Language-Specific Components

Vázquez, R.; Boggia, M.; Raganato, A.; Loppi, N.; Grönroos, S.; Tiedemann, J.
2022

Abstract

We give an update on the Found in Translation (FoTran) project, focusing on the study of emerging language-agnostic representations from neural machine translation (NMT). We describe our attention-bridge model, a modular NMT model which connects language-specific components through a shared network layer. Our latest implementation supports distributed training over many nodes and GPUs in order to substantially scale up the number of languages that can be included in a modern neural translation architecture.
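The abstract describes the attention bridge as a shared layer that connects language-specific encoders and decoders. A minimal sketch of the underlying idea, using NumPy: a fixed set of shared, language-independent query vectors attends over variable-length encoder output and compresses it into a fixed-size intermediate representation. The function and variable names (`attention_bridge`, `bridge_queries`) are illustrative, not taken from the project's codebase, and the real model uses trained multi-head attention rather than this single-head toy version.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_bridge(enc_states, bridge_queries):
    """Compress a variable-length sequence of encoder states into a
    fixed number of 'bridge' vectors via attention.

    enc_states:     (seq_len, d) output of a language-specific encoder
    bridge_queries: (k, d) shared, language-independent query matrix
    returns:        (k, d) fixed-size language-agnostic representation
    """
    scores = bridge_queries @ enc_states.T    # (k, seq_len) similarity scores
    weights = softmax(scores, axis=-1)        # each bridge vector attends over all tokens
    return weights @ enc_states               # (k, d) weighted combination of states

# Sentences of different lengths in different languages map to the
# same (k, d) shape, so any decoder can be attached to the bridge.
rng = np.random.default_rng(0)
queries = rng.normal(size=(3, 8))                       # k=3 shared queries, d=8
short_repr = attention_bridge(rng.normal(size=(5, 8)), queries)
long_repr = attention_bridge(rng.normal(size=(20, 8)), queries)
```

Because the bridge output has a fixed shape regardless of input length or language, decoders only ever see this shared interface, which is what makes the encoder/decoder components modular and swappable.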
Type: paper
Keywords: Computational linguistics; Computer aided language translation; Machine components; Network layers; Program processors; Bridge model; Implementation support; Latest development; Machine translation models; Modulars; Scale-up; Scaling-up; Shared network; Specific component; Neural machine translation
Language: English
Conference: 23rd Annual Conference of the European Association for Machine Translation, EAMT 2022, 1-3 June 2022
Editors: Macken, L.; Rufener, A.; Van den Bogaert, J.; Daems, J.; Tezcan, A.; Vanroy, B.; Fonteyne, M.; Barrault, L.; Costa-jussà, M.R.; Kemp, E.; Pilos, E.; Declercq, C.; Koponen, M.; Forcada, M.L.; Scarton, C.; Moniz, H.
Proceedings: EAMT 2022 - Proceedings of the 23rd Annual Conference of the European Association for Machine Translation
ISBN: 9789464597622
Year: 2022
Pages: 311-312
Access: open
Vázquez, R., Boggia, M., Raganato, A., Loppi, N., Grönroos, S., Tiedemann, J. (2022). Latest Development in the FoTran Project - Scaling Up Language Coverage in Neural Machine Translation Using Distributed Training with Language-Specific Components. In EAMT 2022 - Proceedings of the 23rd Annual Conference of the European Association for Machine Translation (pp. 311-312).
Files in this record:

Vazquez-2022-EAMT-VoR.pdf

Open access

Description: Conference paper
Attachment type: Publisher's Version (Version of Record, VoR)
Size: 223.41 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/394431
Citations
  • Scopus: 0
  • Web of Science: not available
Social impact