
Semi-Supervised Semantic Dependency Parsing Using CRF Autoencoders

Zixia Jia, Youmi Ma, Jiong Cai, Kewei Tu


Abstract
Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations. We propose an approach to semi-supervised learning of semantic dependency parsers based on the CRF autoencoder framework. Our encoder is a discriminative neural semantic dependency parser that predicts the latent parse graph of the input sentence. Our decoder is a generative neural model that reconstructs the input sentence conditioned on the latent parse graph. Our model is arc-factored and therefore parsing and learning are both tractable. Experiments show our model achieves significant and consistent improvement over the supervised baseline.
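The abstract's key point is that because the model is arc-factored and words may take multiple heads, each candidate arc can be treated as an independent latent variable, which makes both parsing and the autoencoder objective tractable. A minimal NumPy sketch of that idea (all names, sizes, and the Bernoulli-per-arc formulation are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy setup: a sentence of n words, where the encoder
# (in the paper, a discriminative neural parser) yields an arc score
# matrix S with S[h, m] scoring a dependency arc head h -> modifier m.
rng = np.random.default_rng(0)
n = 4
S = rng.normal(size=(n, n))

# Arc-factored encoder: since semantic dependency parsing lets a word
# take multiple heads, each arc can be an independent Bernoulli latent
# variable, so arc posteriors are simply elementwise sigmoids.
arc_prob = sigmoid(S)

# Tractable parsing: the highest-probability graph keeps every arc
# whose posterior exceeds 0.5, decided independently per arc.
parse = arc_prob > 0.5

# Decoder sketch: R[h, m] stands in for the generative model's score
# for reconstructing word m conditioned on an arc from h. Because the
# model factors over arcs, the expected reconstruction score under the
# encoder's posterior is a single elementwise product and sum.
R = rng.normal(size=(n, n))
objective = float(np.sum(arc_prob * R))
```

In this sketch, semi-supervised training would maximize the reconstruction-based `objective` on unlabeled sentences while also fitting `arc_prob` to gold graphs on labeled ones; the arc-factored form is what keeps both terms computable without summing over exponentially many graphs.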
Anthology ID:
2020.acl-main.607
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6795–6805
URL:
https://aclanthology.org/2020.acl-main.607
DOI:
10.18653/v1/2020.acl-main.607
Cite (ACL):
Zixia Jia, Youmi Ma, Jiong Cai, and Kewei Tu. 2020. Semi-Supervised Semantic Dependency Parsing Using CRF Autoencoders. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6795–6805, Online. Association for Computational Linguistics.
Cite (Informal):
Semi-Supervised Semantic Dependency Parsing Using CRF Autoencoders (Jia et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.607.pdf
Video:
http://slideslive.com/38929248