
Context-Aware Representations for Knowledge Base Relation Extraction

Daniil Sorokin, Iryna Gurevych


Abstract
We demonstrate that for sentence-level relation extraction it is beneficial to consider other relations in the sentential context while predicting the target relation. Our architecture uses an LSTM-based encoder to jointly learn representations for all relations in a single sentence. We combine the context representations with an attention mechanism to make the final prediction. We use the Wikidata knowledge base to construct a dataset of multiple relations per sentence and to evaluate our approach. Compared to a baseline system, our method results in an average error reduction of 24% on a held-out set of relations. The code and the dataset to replicate the experiments are made available at https://github.com/ukplab/.
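To make the architecture described in the abstract concrete, here is a minimal PyTorch sketch of the idea: a shared LSTM encodes the target relation candidate and the other relation candidates in the same sentence, and attention over the context representations informs the final prediction. The class name, the bilinear attention form, the handling of entity positions, and all hyperparameters are illustrative assumptions rather than the authors' exact configuration; see the linked repository for the paper's implementation.

```python
import torch
import torch.nn as nn


class ContextAwareRelationExtractor(nn.Module):
    """Sketch: encode each relation candidate with a shared LSTM, then
    attend over the context relation representations when scoring the
    target relation. Entity-position features are omitted for brevity."""

    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128, n_relations=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # bilinear attention between target and context representations (assumed form)
        self.attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.out = nn.Linear(2 * hidden_dim, n_relations)

    def forward(self, target_tokens, context_tokens):
        # target_tokens: (batch, seq_len)
        # context_tokens: (batch, n_ctx, seq_len) -- other relation candidates
        _, (h_t, _) = self.encoder(self.embed(target_tokens))
        target = h_t[-1]                                  # (batch, hidden)

        b, n_ctx, seq = context_tokens.shape
        _, (h_c, _) = self.encoder(self.embed(context_tokens.view(b * n_ctx, seq)))
        context = h_c[-1].view(b, n_ctx, -1)              # (batch, n_ctx, hidden)

        # attention weights of the target over the context relations
        scores = torch.einsum("bh,bnh->bn", self.attn(target), context)
        weights = torch.softmax(scores, dim=1)            # (batch, n_ctx)
        ctx_summary = (weights.unsqueeze(-1) * context).sum(dim=1)

        # final prediction from target representation + attended context
        return self.out(torch.cat([target, ctx_summary], dim=1))


# Usage with toy shapes: 4 sentences, 3 context relations each, length-12 token windows.
model = ContextAwareRelationExtractor(vocab_size=1000)
logits = model(torch.randint(1000, (4, 12)), torch.randint(1000, (4, 3, 12)))
print(logits.shape)  # torch.Size([4, 50])
```

Sharing the encoder across the target and context candidates is what lets the model learn representations for all relations in a sentence jointly; the attention step then decides how much each context relation should influence the target prediction.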
Anthology ID: D17-1188
Volume: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month: September
Year: 2017
Address: Copenhagen, Denmark
Editors: Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 1784–1789
URL: https://aclanthology.org/D17-1188
DOI: 10.18653/v1/D17-1188
Cite (ACL):
Daniil Sorokin and Iryna Gurevych. 2017. Context-Aware Representations for Knowledge Base Relation Extraction. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1784–1789, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Context-Aware Representations for Knowledge Base Relation Extraction (Sorokin & Gurevych, EMNLP 2017)
PDF: https://aclanthology.org/D17-1188.pdf