Diversifying Neural Conversation Model with Maximal Marginal Relevance

Yiping Song, Zhiliang Tian, Dongyan Zhao, Ming Zhang, Rui Yan


Abstract
Neural conversation systems, typically using sequence-to-sequence (seq2seq) models, have shown promising progress recently. However, traditional seq2seq models suffer from a severe weakness: during beam search decoding, they tend to rank universal replies at the top of the candidate list, resulting in a lack of diversity among candidate replies. Maximal Marginal Relevance (MMR) is a ranking algorithm that has been widely used for subset selection. In this paper, we propose the MMR-BS decoding method, which incorporates MMR into the beam search (BS) process of seq2seq. MMR-BS improves the diversity of generated replies without sacrificing their relevance to the user-issued query. Experiments show that our proposed model outperforms the comparison methods.
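The MMR criterion the abstract refers to selects each next reply by trading off relevance to the query against redundancy with replies already selected: argmax over remaining candidates of λ·Sim(c, query) − (1−λ)·max Sim(c, selected). The sketch below illustrates this selection loop in isolation; the token-level Jaccard similarity and the function names are illustrative stand-ins, not the similarity measures or the beam-search integration used in the paper.

```python
def jaccard(a, b):
    """Token-overlap similarity: a simple stand-in for a learned similarity."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def mmr_rerank(query, candidates, k=3, lam=0.7):
    """Greedily pick k candidates, balancing relevance to the query
    (weight lam) against redundancy with already-selected replies
    (weight 1 - lam)."""
    selected = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        def mmr_score(c):
            relevance = jaccard(c, query)
            redundancy = max((jaccard(c, s) for s in selected), default=0.0)
            return lam * relevance - (1 - lam) * redundancy
        best = max(remaining, key=mmr_score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With λ close to 1 the selection behaves like plain relevance ranking; lowering λ penalizes near-duplicate candidates more heavily, which is the diversity effect the paper targets.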
Anthology ID:
I17-2029
Volume:
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2017
Address:
Taipei, Taiwan
Editors:
Greg Kondrak, Taro Watanabe
Venue:
IJCNLP
Publisher:
Asian Federation of Natural Language Processing
Pages:
169–174
URL:
https://aclanthology.org/I17-2029
Cite (ACL):
Yiping Song, Zhiliang Tian, Dongyan Zhao, Ming Zhang, and Rui Yan. 2017. Diversifying Neural Conversation Model with Maximal Marginal Relevance. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 169–174, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
Diversifying Neural Conversation Model with Maximal Marginal Relevance (Song et al., IJCNLP 2017)
PDF:
https://aclanthology.org/I17-2029.pdf