
Layer-Wise Multi-View Learning for Neural Machine Translation

Qiang Wang, Changliang Li, Yue Zhang, Tong Xiao, Jingbo Zhu


Abstract
Traditional neural machine translation is limited to the topmost encoder layer's context representation and cannot directly perceive the lower encoder layers. Existing solutions usually rely on adjustments to the network architecture, making computation more complicated or introducing additional structural restrictions. In this work, we propose layer-wise multi-view learning to solve this problem, circumventing the need to change the model structure. We regard each encoder layer's off-the-shelf output, a by-product of layer-by-layer encoding, as a redundant view of the input sentence. In this way, in addition to the topmost encoder layer (referred to as the primary view), we also incorporate an intermediate encoder layer as the auxiliary view. We feed the two views to a partially shared decoder to maintain independent predictions. Consistency regularization based on KL divergence is used to encourage the two views to learn from each other. Extensive experimental results on five translation tasks show that our approach yields stable improvements over multiple strong baselines. As another bonus, our method is agnostic to network architectures and can maintain the same inference speed as the original model.
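The training objective described in the abstract can be illustrated with a toy sketch: each view produces its own output distribution over the target vocabulary, each is trained with its own cross-entropy term, and a symmetric KL consistency term pulls the two distributions together. This is a minimal per-token illustration, not the authors' implementation; the function names, the symmetric form of the KL term, and the weight `alpha` are assumptions for exposition.

```python
import math

def kl(p, q):
    """KL(p || q) for two discrete distributions over the same vocabulary."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def multi_view_loss(primary_probs, auxiliary_probs, target, alpha=0.5):
    """Toy per-token objective: each view's cross-entropy on the gold target,
    plus an alpha-weighted symmetric KL consistency term between the views.
    (Hypothetical formulation; the paper's exact loss may differ.)"""
    ce_primary = -math.log(primary_probs[target])      # topmost-layer view
    ce_auxiliary = -math.log(auxiliary_probs[target])  # intermediate-layer view
    consistency = 0.5 * (kl(primary_probs, auxiliary_probs)
                         + kl(auxiliary_probs, primary_probs))
    return ce_primary + ce_auxiliary + alpha * consistency
```

When the two views agree exactly, the consistency term vanishes and the loss reduces to the sum of the two cross-entropies; as their predictions diverge, the KL term grows and pushes them back toward each other, which is the "learn from each other" effect the abstract describes.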
Anthology ID:
2020.coling-main.377
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4275–4286
URL:
https://aclanthology.org/2020.coling-main.377
DOI:
10.18653/v1/2020.coling-main.377
Cite (ACL):
Qiang Wang, Changliang Li, Yue Zhang, Tong Xiao, and Jingbo Zhu. 2020. Layer-Wise Multi-View Learning for Neural Machine Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 4275–4286, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Layer-Wise Multi-View Learning for Neural Machine Translation (Wang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.377.pdf