Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs

Zikang Wang, Linjing Li, Daniel Zeng


Abstract
Natural Language Inference (NLI) is a vital task in natural language processing that aims to identify the logical relationship between two sentences. Most existing approaches infer this relationship using only semantic knowledge learned from the training corpus; background knowledge is rarely used, or is limited to a few specific types. In this paper, we propose a novel Knowledge Graph-enhanced NLI (KGNLI) model that leverages background knowledge stored in knowledge graphs for NLI. KGNLI consists of three components: a semantic-relation representation module, a knowledge-relation representation module, and a label prediction module. Unlike previous methods, the proposed model can flexibly combine various kinds of background knowledge. Experiments on four benchmarks (SNLI, MultiNLI, SciTail, and BNLI) validate the effectiveness of our model.
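The abstract names three components but does not spell out how they interact. The following is a purely hypothetical toy sketch (not the authors' code, and not their actual architecture) of how a semantic-relation representation, a knowledge-relation representation, and a label predictor might be wired together; all function names, feature choices, and weights are illustrative assumptions.

```python
# Hypothetical sketch of a three-module KGNLI-style pipeline.
# All names, features, and weights below are illustrative, not from the paper.

def semantic_relation(premise_vec, hypothesis_vec):
    """Semantic-relation module (toy): compare sentence encodings via
    element-wise difference and product, concatenated."""
    diff = [p - h for p, h in zip(premise_vec, hypothesis_vec)]
    prod = [p * h for p, h in zip(premise_vec, hypothesis_vec)]
    return diff + prod

def knowledge_relation(kg_features):
    """Knowledge-relation module (toy): aggregate background-knowledge
    signals (e.g., per-word-pair KG match indicators) by averaging."""
    if not kg_features:
        return [0.0]
    return [sum(col) / len(col) for col in zip(*kg_features)]

def predict_label(sem_vec, kg_vec, weights):
    """Label prediction module (toy): linearly score each NLI label from
    the concatenated semantic and knowledge representations."""
    features = sem_vec + kg_vec
    scores = {label: sum(w * x for w, x in zip(ws, features))
              for label, ws in weights.items()}
    return max(scores, key=scores.get)

# Toy usage with made-up numbers:
premise, hypothesis = [0.9, 0.1], [0.8, 0.2]
kg = [[1.0], [0.0]]          # two word pairs, one matched in the KG
sem = semantic_relation(premise, hypothesis)
knw = knowledge_relation(kg)
weights = {
    "entailment":    [1.0, 1.0, 1.0, 1.0, 2.0],
    "contradiction": [-1.0, -1.0, -1.0, -1.0, -2.0],
    "neutral":       [0.0, 0.0, 0.0, 0.0, 0.0],
}
print(predict_label(sem, knw, weights))  # prints "entailment"
```

The point of the sketch is only the data flow: the semantic and knowledge representations are computed independently and then combined for label prediction, which is what lets different kinds of background knowledge be swapped in.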
Anthology ID:
2020.coling-main.571
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6498–6508
URL:
https://aclanthology.org/2020.coling-main.571
DOI:
10.18653/v1/2020.coling-main.571
Cite (ACL):
Zikang Wang, Linjing Li, and Daniel Zeng. 2020. Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6498–6508, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs (Wang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.571.pdf
Data
MultiNLI, SNLI