Task-oriented Word Embedding for Text Classification

Qian Liu, Heyan Huang, Yang Gao, Xiaochi Wei, Yuxin Tian, Luyang Liu


Abstract
Distributed word representation plays a pivotal role in various natural language processing tasks. Despite this success, most existing methods consider only contextual information, which is suboptimal across tasks because it ignores task-specific features. Rational word embeddings should capture both the semantic features and the task-specific features of words. In this paper, we propose a task-oriented word embedding method and apply it to the text classification task. Through a function-aware component, our method regularizes the distribution of words so that the embedding space has a clear classification boundary. We evaluate our method on five text classification datasets. The experimental results show that our method significantly outperforms the state-of-the-art methods.
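The abstract describes a joint objective that learns word vectors from context while regularizing them toward the classification task. The following is a minimal, hypothetical sketch of that general idea, not the paper's exact function-aware component: a skip-gram-style negative-sampling loss combined with a classification loss over mean-pooled document embeddings. All class names, the pooling choice, and the weighting parameter `lam` are illustrative assumptions.

```python
# Hypothetical sketch of a task-oriented embedding objective:
# semantic (context) loss + task-specific classification loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskOrientedEmbedding(nn.Module):
    def __init__(self, vocab_size, dim, num_classes):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, dim)   # word vectors being learned
        self.out_embed = nn.Embedding(vocab_size, dim)  # context vectors
        self.classifier = nn.Linear(dim, num_classes)   # task-specific head

    def context_loss(self, center, context, negatives):
        # Skip-gram loss with negative sampling on (center, context) pairs.
        v = self.in_embed(center)                                             # (B, D)
        pos = (v * self.out_embed(context)).sum(-1)                           # (B,)
        neg = torch.bmm(self.out_embed(negatives), v.unsqueeze(-1)).squeeze(-1)  # (B, K)
        return -(F.logsigmoid(pos).mean() + F.logsigmoid(-neg).mean())

    def task_loss(self, doc_word_ids, labels):
        # Classification loss over mean-pooled document vectors; acts as a
        # task-aware regularizer that separates words of different classes.
        doc_vec = self.in_embed(doc_word_ids).mean(dim=1)                     # (B, D)
        return F.cross_entropy(self.classifier(doc_vec), labels)

    def forward(self, center, context, negatives, doc_word_ids, labels, lam=1.0):
        # Joint objective: contextual term plus weighted task-specific term.
        return self.context_loss(center, context, negatives) + lam * self.task_loss(doc_word_ids, labels)
```

Under this reading, `lam` trades off semantic fidelity against a clear class boundary in the embedding space; the paper's actual formulation may differ.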
Anthology ID:
C18-1172
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2023–2032
URL:
https://aclanthology.org/C18-1172
Cite (ACL):
Qian Liu, Heyan Huang, Yang Gao, Xiaochi Wei, Yuxin Tian, and Luyang Liu. 2018. Task-oriented Word Embedding for Text Classification. In Proceedings of the 27th International Conference on Computational Linguistics, pages 2023–2032, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Task-oriented Word Embedding for Text Classification (Liu et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1172.pdf
Code
qianliu0708/ToWE
Data
AG News, IMDb Movie Reviews, SST, SST-2