
Event-Centric Question Answering via Contrastive Learning and Invertible Event Transformation

Junru Lu, Xingwei Tan, Gabriele Pergola, Lin Gui, Yulan He


Abstract
Human reading comprehension often requires reasoning about event semantic relations in narratives, a task exemplified by Event-centric Question Answering (QA). To address event-centric QA, we propose a novel QA model with contrastive learning and invertible event transformation, called TranCLR. Our proposed model utilizes an invertible transformation matrix to project semantic vectors of events into a common event embedding space, trained with contrastive learning, and thus naturally injects event semantic knowledge into mainstream QA pipelines. The transformation matrix is fine-tuned with the annotated event relation types between events occurring in questions and those in answers, using event-aware question vectors. Experimental results on the Event Semantic Relation Reasoning (ESTER) dataset show significant improvements in both generative and extractive settings compared to existing strong baselines, with over an 8.4% gain in token-level F1 score and a 3.0% gain in Exact Match (EM) score under the multi-answer setting. Qualitative analysis reveals the high quality of the answers generated by TranCLR, demonstrating the feasibility of injecting event knowledge into QA model learning. Our code and models can be found at https://github.com/LuJunru/TranCLR.
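
To make the abstract's two core ideas concrete, here is a minimal PyTorch sketch of (a) a square, invertible-style projection of event vectors into a shared event embedding space and (b) an in-batch contrastive (InfoNCE-style) objective. All names (EventProjection, contrastive_loss) and design details here are illustrative assumptions, not the authors' implementation; the actual TranCLR code is in the linked repository.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EventProjection(nn.Module):
        """Sketch: project event vectors into a common event embedding
        space via a square transformation matrix. Initialized at the
        identity so it starts invertible; how invertibility is maintained
        during training is described in the paper, not enforced here."""
        def __init__(self, hidden_size: int):
            super().__init__()
            self.weight = nn.Parameter(torch.eye(hidden_size))

        def forward(self, event_vectors: torch.Tensor) -> torch.Tensor:
            return event_vectors @ self.weight

    def contrastive_loss(anchors: torch.Tensor,
                         positives: torch.Tensor,
                         temperature: float = 0.07) -> torch.Tensor:
        """InfoNCE-style loss: each anchor's positive is the same-index
        row of `positives`; all other rows act as in-batch negatives."""
        anchors = F.normalize(anchors, dim=-1)
        positives = F.normalize(positives, dim=-1)
        logits = anchors @ positives.T / temperature  # (B, B) similarities
        targets = torch.arange(anchors.size(0), device=anchors.device)
        return F.cross_entropy(logits, targets)

    # Hypothetical usage: align question-side and answer-side event vectors.
    q_events = torch.randn(8, 768)
    a_events = torch.randn(8, 768)
    proj = EventProjection(768)
    loss = contrastive_loss(proj(q_events), proj(a_events))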
Anthology ID:
2022.findings-emnlp.176
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2377–2389
URL:
https://aclanthology.org/2022.findings-emnlp.176
DOI:
10.18653/v1/2022.findings-emnlp.176
Cite (ACL):
Junru Lu, Xingwei Tan, Gabriele Pergola, Lin Gui, and Yulan He. 2022. Event-Centric Question Answering via Contrastive Learning and Invertible Event Transformation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2377–2389, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Event-Centric Question Answering via Contrastive Learning and Invertible Event Transformation (Lu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.176.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.176.mp4