
Entity-Enriched Neural Models for Clinical Question Answering

Bhanu Pratap Singh Rawat, Wei-Hung Weng, So Yeon Min, Preethi Raghavan, Peter Szolovits


Abstract
We explore state-of-the-art neural models for question answering on electronic medical records and improve their ability to generalize to previously unseen (paraphrased) questions at test time. We enable this by learning to predict logical forms as an auxiliary task alongside the main task of answer span detection. The predicted logical forms also serve as a rationale for the answer. We further incorporate medical entity information into these models via the ERNIE architecture. We train our models on the large-scale emrQA dataset and observe that our multi-task entity-enriched models generalize to paraphrased questions ~5% better than the baseline BERT model.
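The multi-task setup in the abstract can be illustrated with a short sketch: the encoder's per-token states feed an answer-span head, the pooled question representation feeds a logical-form classifier, and the two cross-entropy losses are combined. This is a hypothetical PyTorch sketch, not the authors' code; hidden_size, num_logical_forms, and lf_weight are assumed placeholders rather than values from the paper.

import torch
import torch.nn as nn

class MultiTaskQAHead(nn.Module):
    """Joint head: answer-span detection plus auxiliary logical-form prediction.

    Illustrative only: hidden_size and num_logical_forms are assumed
    hyperparameters; the encoder (e.g. BERT/ERNIE) is supplied externally.
    """

    def __init__(self, hidden_size: int = 768, num_logical_forms: int = 30):
        super().__init__()
        self.span_head = nn.Linear(hidden_size, 2)                 # start/end logit per token
        self.lf_head = nn.Linear(hidden_size, num_logical_forms)  # one logical form per question

    def forward(self, token_states, cls_state):
        # token_states: (batch, seq_len, hidden); cls_state: (batch, hidden)
        start_logits, end_logits = self.span_head(token_states).split(1, dim=-1)
        lf_logits = self.lf_head(cls_state)
        return start_logits.squeeze(-1), end_logits.squeeze(-1), lf_logits

def multitask_loss(start_logits, end_logits, lf_logits,
                   start_pos, end_pos, lf_label, lf_weight: float = 0.5):
    """Main span loss plus a weighted auxiliary logical-form loss.

    lf_weight is an assumed trade-off hyperparameter, not one reported
    in the paper.
    """
    ce = nn.CrossEntropyLoss()
    span_loss = (ce(start_logits, start_pos) + ce(end_logits, end_pos)) / 2
    lf_loss = ce(lf_logits, lf_label)
    return span_loss + lf_weight * lf_loss

# Toy usage with random encoder outputs (batch of 2, sequence of 16 tokens).
head = MultiTaskQAHead()
s, e, lf = head(torch.randn(2, 16, 768), torch.randn(2, 768))
loss = multitask_loss(s, e, lf,
                      start_pos=torch.tensor([3, 5]),
                      end_pos=torch.tensor([4, 9]),
                      lf_label=torch.tensor([0, 7]))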
Anthology ID:
2020.bionlp-1.12
Volume:
Proceedings of the 19th SIGBioMed Workshop on Biomedical Language Processing
Month:
July
Year:
2020
Address:
Online
Editors:
Dina Demner-Fushman, Kevin Bretonnel Cohen, Sophia Ananiadou, Junichi Tsujii
Venue:
BioNLP
SIG:
SIGBIOMED
Publisher:
Association for Computational Linguistics
Pages:
112–122
URL:
https://aclanthology.org/2020.bionlp-1.12
DOI:
10.18653/v1/2020.bionlp-1.12
Cite (ACL):
Bhanu Pratap Singh Rawat, Wei-Hung Weng, So Yeon Min, Preethi Raghavan, and Peter Szolovits. 2020. Entity-Enriched Neural Models for Clinical Question Answering. In Proceedings of the 19th SIGBioMed Workshop on Biomedical Language Processing, pages 112–122, Online. Association for Computational Linguistics.
Cite (Informal):
Entity-Enriched Neural Models for Clinical Question Answering (Rawat et al., BioNLP 2020)
PDF:
https://aclanthology.org/2020.bionlp-1.12.pdf
Video:
http://slideslive.com/38929646
Code:
panushri25/emrQA (+ additional community code)
Data:
SQuAD, emrQA