
Self-Adaptive Named Entity Recognition by Retrieving Unstructured Knowledge

Kosuke Nishida, Naoki Yoshinaga, Kyosuke Nishida


Abstract
Although named entity recognition (NER) helps us to extract domain-specific entities from text (e.g., artists in the music domain), it is costly to create a large amount of training data or a structured knowledge base to perform accurate NER in the target domain. Here, we propose self-adaptive NER, which retrieves external knowledge from unstructured text to learn the usages of entities that have not been learned well. To retrieve useful knowledge for NER, we design an effective two-stage model that retrieves unstructured knowledge using uncertain entities as queries. Our model first predicts the entities in the input and identifies those for which the prediction is not confident. It then retrieves knowledge by using these uncertain entities as queries and concatenates the retrieved text to the original input to revise the prediction. Experiments on the CrossNER datasets demonstrated that our model outperforms strong baselines by 2.35 F1 points.
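The abstract describes a "predict, retrieve, revise" pipeline. The sketch below is a minimal, hypothetical illustration of that control flow; the `predict` and `retrieve` callables, the confidence threshold, and the `[SEP]` concatenation are all assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the two-stage "predict, retrieve, revise" loop
# described in the abstract. All names and thresholds are illustrative.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class EntitySpan:
    text: str          # surface form of the predicted entity
    label: str         # predicted entity type (e.g., "artist")
    confidence: float  # model confidence for this span


def self_adaptive_ner(
    sentence: str,
    predict: Callable[[str], List[EntitySpan]],  # stage-1/2 NER model
    retrieve: Callable[[str], str],              # unstructured-text retriever
    threshold: float = 0.7,                      # assumed confidence cutoff
) -> List[EntitySpan]:
    """Predict entities, retrieve knowledge for uncertain ones, re-predict."""
    # Stage 1: initial prediction on the original input.
    first_pass = predict(sentence)

    # Entities whose prediction is not confident become retrieval queries.
    uncertain = [e for e in first_pass if e.confidence < threshold]
    if not uncertain:
        return first_pass

    # Retrieve unstructured knowledge using the uncertain entities as queries.
    retrieved = " ".join(retrieve(e.text) for e in uncertain)

    # Stage 2: concatenate retrieved text to the input and revise the prediction.
    return predict(sentence + " [SEP] " + retrieved)
```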
Anthology ID:
2023.eacl-main.233
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3193–3205
URL:
https://aclanthology.org/2023.eacl-main.233
DOI:
10.18653/v1/2023.eacl-main.233
Cite (ACL):
Kosuke Nishida, Naoki Yoshinaga, and Kyosuke Nishida. 2023. Self-Adaptive Named Entity Recognition by Retrieving Unstructured Knowledge. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3193–3205, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Self-Adaptive Named Entity Recognition by Retrieving Unstructured Knowledge (Nishida et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.233.pdf
Video:
https://aclanthology.org/2023.eacl-main.233.mp4