
Counterfactual Data Augmentation for Neural Machine Translation

Qi Liu, Matt Kusner, Phil Blunsom


Abstract
We propose a data augmentation method for neural machine translation. It works by interpreting language models and phrasal alignment causally. Specifically, it creates augmented parallel translation corpora by generating (path-specific) counterfactual aligned phrases. We generate these by sampling new source phrases from a masked language model, then sampling an aligned counterfactual target phrase by noting that a translation language model can be interpreted as a Gumbel-Max Structural Causal Model (Oberst and Sontag, 2019). Compared to previous work, our method takes both context and alignment into account to maintain the symmetry between source and target sequences. Experiments on IWSLT’15 English → Vietnamese, WMT’17 English → German, WMT’18 English → Turkish, and WMT’19 robust English → French show that the method can improve the performance of translation, backtranslation and translation robustness.
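The counterfactual step described in the abstract rests on the Gumbel-Max Structural Causal Model of Oberst and Sontag (2019): a categorical draw is argmax(logits + Gumbel noise), so one can infer noise consistent with an observed outcome and replay it under intervened logits. The following is a minimal sketch of that mechanism only, not the authors' implementation; all function names are illustrative, and it uses the standard top-down truncated-Gumbel posterior construction.

```python
import numpy as np


def posterior_gumbels(logits, observed, rng):
    """Sample location-shifted Gumbels G with argmax(G) == observed.

    Top-down construction: the maximum is Gumbel-distributed with
    location logsumexp(logits); the remaining coordinates are Gumbels
    truncated above at that maximum.
    """
    logits = np.asarray(logits, dtype=float)
    # Max of the shifted Gumbels ~ Gumbel(logsumexp(logits)).
    Z = np.log(np.sum(np.exp(logits))) - np.log(-np.log(rng.uniform()))
    G = np.empty_like(logits)
    G[observed] = Z
    for i in range(len(logits)):
        if i == observed:
            continue
        g = logits[i] - np.log(-np.log(rng.uniform()))
        # Truncate the draw so it stays below the observed maximum Z.
        G[i] = -np.log(np.exp(-Z) + np.exp(-g))
    return G


def counterfactual_sample(logits, observed, cf_logits, rng):
    """Counterfactual category under intervened logits `cf_logits`,
    reusing the exogenous noise that produced `observed` under `logits`."""
    G = posterior_gumbels(logits, observed, rng)
    noise = G - np.asarray(logits, dtype=float)  # exogenous Gumbel noise
    return int(np.argmax(np.asarray(cf_logits, dtype=float) + noise))
```

In the paper's setting, `logits` would come from a translation language model conditioned on the original source phrase and `cf_logits` from the same model conditioned on the counterfactually edited source phrase; a key property of the construction is that a null intervention (`cf_logits == logits`) reproduces the observed outcome.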
Anthology ID:
2021.naacl-main.18
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
187–197
URL:
https://aclanthology.org/2021.naacl-main.18
DOI:
10.18653/v1/2021.naacl-main.18
Cite (ACL):
Qi Liu, Matt Kusner, and Phil Blunsom. 2021. Counterfactual Data Augmentation for Neural Machine Translation. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 187–197, Online. Association for Computational Linguistics.
Cite (Informal):
Counterfactual Data Augmentation for Neural Machine Translation (Liu et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.18.pdf
Video:
https://aclanthology.org/2021.naacl-main.18.mp4
Code:
marziehf/DataAugmentationNMT
Data:
C4