
Context-Aware Cross-Attention for Non-Autoregressive Translation

Liang Ding, Longyue Wang, Di Wu, Dacheng Tao, Zhaopeng Tu


Abstract
Non-autoregressive translation (NAT) significantly accelerates inference by predicting the entire target sequence in parallel. However, because the decoder lacks target-side dependency modelling, the conditional generation process depends heavily on cross-attention. In this paper, we reveal a localness perception problem in NAT cross-attention, which makes it difficult to adequately capture source context. To alleviate this problem, we propose to inject signals from neighbouring source tokens into the conventional cross-attention. Experimental results on several representative datasets show that our approach consistently improves translation quality over strong NAT baselines. Extensive analyses demonstrate that the enhanced cross-attention better exploits source context by leveraging both local and global information.
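The core idea of the abstract, strengthening cross-attention with signals from a local source neighbourhood, can be illustrated with a minimal sketch. The snippet below is not the authors' exact formulation: the window half-width, the blending weight alpha, and the argmax-based choice of window centre are all illustrative assumptions. It simply blends a conventional global cross-attention distribution with a windowed local one.

```python
import torch
import torch.nn.functional as F

def localness_enhanced_cross_attention(q, k, v, window=2, alpha=0.5):
    """Cross-attention blended with a local-window variant (illustrative sketch).

    q: (batch, tgt_len, d)    decoder queries
    k, v: (batch, src_len, d) encoder keys/values
    window: half-width of the source neighbourhood treated as "local" (assumed)
    alpha: interpolation weight between local and global attention (assumed)
    """
    d = q.size(-1)
    scores = torch.matmul(q, k.transpose(-1, -2)) / d ** 0.5   # (b, tgt, src)

    # Conventional global attention over all source positions.
    global_attn = F.softmax(scores, dim=-1)

    # Local attention: keep only source tokens within a window centred on each
    # query's most-attended source position (a hypothetical choice of centre;
    # the paper's actual localness signal may differ).
    centre = scores.argmax(dim=-1, keepdim=True)               # (b, tgt, 1)
    src_pos = torch.arange(k.size(1), device=k.device)         # (src,)
    local_mask = (src_pos - centre).abs() <= window            # (b, tgt, src)
    local_scores = scores.masked_fill(~local_mask, float("-inf"))
    local_attn = F.softmax(local_scores, dim=-1)

    # Blend local and global distributions, then aggregate the values.
    attn = alpha * local_attn + (1 - alpha) * global_attn
    return torch.matmul(attn, v)
```

With q of shape (batch, tgt_len, d) and k, v of shape (batch, src_len, d), the function returns a (batch, tgt_len, d) context tensor; setting alpha=0 recovers ordinary global cross-attention.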
Anthology ID: 2020.coling-main.389
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 4396–4402
URL: https://aclanthology.org/2020.coling-main.389
DOI: 10.18653/v1/2020.coling-main.389
Cite (ACL): Liang Ding, Longyue Wang, Di Wu, Dacheng Tao, and Zhaopeng Tu. 2020. Context-Aware Cross-Attention for Non-Autoregressive Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 4396–4402, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Context-Aware Cross-Attention for Non-Autoregressive Translation (Ding et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.389.pdf