
Fine-grained Temporal Relation Extraction with Ordered-Neuron LSTM and Graph Convolutional Networks

Minh Tran Phu, Minh Van Nguyen, Thien Huu Nguyen


Abstract
Fine-grained temporal relation extraction (FineTempRel) aims to recognize the durations and timeline of event mentions in text. A shortcoming of current deep learning models for FineTempRel is that they fail to exploit the syntactic structures of the input sentences to enrich the representation vectors. In this work, we propose to fill this gap by introducing novel methods to integrate syntactic structures into deep learning models for FineTempRel. The proposed model focuses on two types of syntactic information from dependency trees, i.e., syntax-based importance scores for representation learning of the words and syntactic connections to identify important context words for the event mentions. We also present two novel techniques to facilitate knowledge transfer between the subtasks of FineTempRel, leading to a novel model with state-of-the-art performance for this task.
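To illustrate the general idea of propagating information along dependency arcs, the sketch below shows a single graph convolutional layer applied over a dependency-tree adjacency matrix. This is a minimal, generic GCN example, not the authors' implementation; the class name, tensor shapes, and the averaged-neighbor aggregation are assumptions made for illustration only.

```python
# Minimal sketch of a GCN layer over dependency arcs (not the paper's code).
import torch
import torch.nn as nn


class DependencyGCNLayer(nn.Module):
    """One graph-convolution layer over a dependency adjacency matrix."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, hidden: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, dim) word representations (e.g., from an LSTM)
        # adj:    (batch, seq_len, seq_len) symmetric dependency adjacency matrix
        adj = adj + torch.eye(adj.size(-1), device=adj.device)  # add self-loops
        degree = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)   # row degrees
        pooled = torch.bmm(adj, hidden) / degree                # average over neighbors
        return torch.relu(self.linear(pooled))


# Toy usage: one sentence of 5 words with 8-dimensional representations.
if __name__ == "__main__":
    h = torch.randn(1, 5, 8)
    a = torch.zeros(1, 5, 5)
    a[0, 0, 1] = a[0, 1, 0] = 1.0  # hypothetical dependency arc between words 0 and 1
    layer = DependencyGCNLayer(8)
    print(layer(h, a).shape)       # torch.Size([1, 5, 8])
```

In such syntax-aware models, each word's representation is updated from its syntactic neighbors, so context words connected to an event mention in the dependency tree can directly influence its representation.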
Anthology ID:
2021.wnut-1.5
Volume:
Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021)
Month:
November
Year:
2021
Address:
Online
Editors:
Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
Venue:
WNUT
Publisher:
Association for Computational Linguistics
Pages:
35–45
URL:
https://aclanthology.org/2021.wnut-1.5
DOI:
10.18653/v1/2021.wnut-1.5
Cite (ACL):
Minh Tran Phu, Minh Van Nguyen, and Thien Huu Nguyen. 2021. Fine-grained Temporal Relation Extraction with Ordered-Neuron LSTM and Graph Convolutional Networks. In Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021), pages 35–45, Online. Association for Computational Linguistics.
Cite (Informal):
Fine-grained Temporal Relation Extraction with Ordered-Neuron LSTM and Graph Convolutional Networks (Tran Phu et al., WNUT 2021)
PDF:
https://aclanthology.org/2021.wnut-1.5.pdf
Video:
https://aclanthology.org/2021.wnut-1.5.mp4