
Towards Incremental Transformers: An Empirical Analysis of Transformer Models for Incremental NLU

Patrick Kahardipraja, Brielen Madureira, David Schlangen


Abstract
Incremental processing allows interactive systems to respond based on partial inputs, which is a desirable property e.g. in dialogue agents. The currently popular Transformer architecture inherently processes sequences as a whole, abstracting away the notion of time. Recent work attempts to apply Transformers incrementally via restart-incrementality by repeatedly feeding, to an unchanged model, increasingly longer input prefixes to produce partial outputs. However, this approach is computationally costly and does not scale efficiently for long sequences. In parallel, we witness efforts to make Transformers more efficient, e.g. the Linear Transformer (LT) with a recurrence mechanism. In this work, we examine the feasibility of LT for incremental NLU in English. Our results show that the recurrent LT model has better incremental performance and faster inference speed compared to the standard Transformer and LT with restart-incrementality, at the cost of part of the non-incremental (full sequence) quality. We show that the performance drop can be mitigated by training the model to wait for right context before committing to an output and that training with input prefixes is beneficial for delivering correct partial outputs.
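The restart-incrementality strategy described in the abstract can be illustrated with a minimal sketch: after each new token, the unchanged model re-processes the whole prefix from scratch and emits labels for it. The `model` interface below (a sequence tagger returning per-token logits) is a hypothetical stand-in for illustration, not the authors' released code.

```python
import torch

def restart_incremental_labels(model, token_ids):
    """Restart-incremental loop: for every prefix of the input, re-run the
    full (unchanged) model and collect its partial label sequence.
    Each prefix is processed from scratch, so cost grows with sequence length."""
    partial_outputs = []
    for t in range(1, len(token_ids) + 1):
        prefix = torch.tensor(token_ids[:t]).unsqueeze(0)  # shape (1, t)
        with torch.no_grad():
            logits = model(prefix)  # assumed shape (1, t, num_labels)
        # Labels for the current prefix; earlier decisions may be revised.
        partial_outputs.append(logits.argmax(-1).squeeze(0).tolist())
    return partial_outputs  # one (possibly revised) label sequence per prefix
```

This repeated re-computation is what makes restart-incrementality costly for long sequences, and it is the baseline against which the recurrent Linear Transformer is compared in the paper.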
Anthology ID:
2021.emnlp-main.90
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1178–1189
URL:
https://aclanthology.org/2021.emnlp-main.90
DOI:
10.18653/v1/2021.emnlp-main.90
Bibkey:
Cite (ACL):
Patrick Kahardipraja, Brielen Madureira, and David Schlangen. 2021. Towards Incremental Transformers: An Empirical Analysis of Transformer Models for Incremental NLU. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1178–1189, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Towards Incremental Transformers: An Empirical Analysis of Transformer Models for Incremental NLU (Kahardipraja et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.90.pdf
Video:
https://aclanthology.org/2021.emnlp-main.90.mp4
Code:
pkhdipraja/towards-incremental-transformers
Data:
ATIS, SNIPS