
When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation?

Zhuoyuan Mao, Chenhui Chu, Raj Dabre, Haiyue Song, Zhen Wan, Sadao Kurohashi


Abstract
Word alignment has proven to benefit many-to-many neural machine translation (NMT). However, previous methods relied on high-quality ground-truth bilingual dictionaries for pre-editing, and such dictionaries are unavailable for most language pairs. Meanwhile, a contrastive objective can implicitly exploit automatically learned word alignments, an approach that has not been explored in many-to-many NMT. This work proposes a word-level contrastive objective that leverages word alignments for many-to-many NMT. Empirical results show gains of 0.8 BLEU for several language pairs. Analyses reveal that in many-to-many NMT, the encoder’s sentence retrieval performance correlates strongly with translation quality, which explains when the proposed method impacts translation. This motivates future work on improving the encoder’s sentence retrieval performance for many-to-many NMT.
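The abstract does not spell out the objective, but a word-level contrastive loss over aligned word pairs is commonly formulated in the InfoNCE style with in-batch negatives. Below is a minimal PyTorch sketch under that assumption; the function name, the temperature value, and the use of in-batch negatives are illustrative choices, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def word_contrastive_loss(src_word_states: torch.Tensor,
                          tgt_word_states: torch.Tensor,
                          temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style contrastive loss over aligned word representations.

    src_word_states, tgt_word_states: (n_pairs, d) hidden states for word
    pairs produced by an automatic word aligner; row i of each tensor is
    assumed to be an aligned pair, and every other row in the batch serves
    as an in-batch negative.
    """
    src = F.normalize(src_word_states, dim=-1)
    tgt = F.normalize(tgt_word_states, dim=-1)
    # (n_pairs, n_pairs) cosine-similarity matrix, scaled by temperature.
    logits = src @ tgt.T / temperature
    labels = torch.arange(src.size(0), device=src.device)
    # Symmetric cross-entropy: each aligned pair must outscore the
    # in-batch negatives in both the src->tgt and tgt->src directions.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.T, labels))

# Example: 4 aligned word pairs with 512-dimensional states.
src = torch.randn(4, 512)
tgt = torch.randn(4, 512)
loss = word_contrastive_loss(src, tgt)
```

In training, a loss of this form would typically be added to the standard NMT cross-entropy objective, pulling aligned words' representations together across languages while pushing apart unaligned ones.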
Anthology ID:
2022.findings-naacl.134
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1766–1775
URL:
https://aclanthology.org/2022.findings-naacl.134
DOI:
10.18653/v1/2022.findings-naacl.134
Cite (ACL):
Zhuoyuan Mao, Chenhui Chu, Raj Dabre, Haiyue Song, Zhen Wan, and Sadao Kurohashi. 2022. When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation?. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1766–1775, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation? (Mao et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.134.pdf
Video:
https://aclanthology.org/2022.findings-naacl.134.mp4
Data
word2word