
TREND: Trigger-Enhanced Relation-Extraction Network for Dialogues

Po-Wei Lin, Shang-Yu Su, Yun-Nung Chen


Abstract
The goal of dialogue relation extraction (DRE) is to identify the relation between two entities in a given dialogue. During conversations, speakers may expose their relations to certain entities through explicit or implicit clues; such clues are called “triggers”. However, trigger annotations may not always be available for the target data, so it is challenging to leverage such information to enhance performance. Therefore, this paper proposes to learn how to identify triggers from data with trigger annotations and then to transfer the trigger-finding capability to other datasets for better performance. The experiments show that the proposed approach improves relation extraction performance on unseen relations and demonstrate the transferability of the proposed trigger-finding model across different domains and datasets.
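The abstract describes the approach only at a high level. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' released implementation (which is at miulab/trend): a shared encoder feeds a token-level trigger-finding head and a relation-classification head, so a trigger head trained on trigger-annotated data can later be reused on a dataset without trigger labels. All module names, sizes, and the trigger-weighted pooling scheme are assumptions for illustration.

```python
# Hypothetical sketch of trigger-enhanced dialogue relation extraction.
# Not the authors' code: encoder choice, head design, and pooling are assumptions.
import torch
import torch.nn as nn


class TriggerEnhancedRE(nn.Module):
    def __init__(self, vocab_size=30522, hidden=256, num_relations=36):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.trigger_head = nn.Linear(2 * hidden, 2)          # token is trigger / not
        self.relation_head = nn.Linear(2 * hidden, num_relations)

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))        # (B, T, 2H)
        trigger_logits = self.trigger_head(states)             # per-token trigger scores
        # Pool encoder states weighted by predicted trigger probability so the
        # relation classifier attends to trigger-like tokens; on data without
        # trigger labels, only the relation loss would be used.
        weights = trigger_logits.softmax(-1)[..., 1:2]         # (B, T, 1)
        pooled = (weights * states).sum(1) / weights.sum(1).clamp(min=1e-6)
        relation_logits = self.relation_head(pooled)
        return trigger_logits, relation_logits


if __name__ == "__main__":
    model = TriggerEnhancedRE()
    tokens = torch.randint(0, 30522, (2, 40))                  # toy dialogue batch
    trig, rel = model(tokens)
    print(trig.shape, rel.shape)                                # (2, 40, 2) (2, 36)
```

In this reading, the trigger head is supervised on a dataset that provides trigger spans (e.g., DialogRE) and then transferred to a dataset without them (e.g., DDRel), matching the transfer setting described in the abstract; the specific transfer mechanism used in the paper may differ.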
Anthology ID:
2022.sigdial-1.58
Volume:
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
September
Year:
2022
Address:
Edinburgh, UK
Editors:
Oliver Lemon, Dilek Hakkani-Tur, Junyi Jessy Li, Arash Ashrafzadeh, Daniel Hernández Garcia, Malihe Alikhani, David Vandyke, Ondřej Dušek
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
623–629
URL:
https://aclanthology.org/2022.sigdial-1.58
DOI:
10.18653/v1/2022.sigdial-1.58
Cite (ACL):
Po-Wei Lin, Shang-Yu Su, and Yun-Nung Chen. 2022. TREND: Trigger-Enhanced Relation-Extraction Network for Dialogues. In Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 623–629, Edinburgh, UK. Association for Computational Linguistics.
Cite (Informal):
TREND: Trigger-Enhanced Relation-Extraction Network for Dialogues (Lin et al., SIGDIAL 2022)
PDF:
https://aclanthology.org/2022.sigdial-1.58.pdf
Code:
miulab/trend
Data:
DDRel, DialogRE