
Constructing contrastive samples via summarization for text classification with limited annotations

Yangkai Du, Tengfei Ma, Lingfei Wu, Fangli Xu, Xuhong Zhang, Bo Long, Shouling Ji


Abstract
Contrastive learning has emerged as a powerful representation learning method and facilitates various downstream tasks, especially when supervised data is limited. How to construct efficient contrastive samples through data augmentation is key to its success. Unlike in vision tasks, data augmentation methods for contrastive learning have not been investigated sufficiently in language tasks. In this paper, we propose a novel approach to construct contrastive samples for language tasks using text summarization. We use these samples for supervised contrastive learning to gain better text representations, which greatly benefits text classification tasks with limited annotations. To further improve the method, we mix up samples from different classes and add an extra regularization, named Mixsum, in addition to the cross-entropy loss. Experiments on real-world text classification datasets (Amazon-5, Yelp-5, AG News, and IMDb) demonstrate the effectiveness of the proposed contrastive learning framework with summarization-based data augmentation and Mixsum regularization.
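The core idea in the abstract — treating a document's summary as an augmented view of that document and pulling same-class representations together — can be sketched with a supervised contrastive loss in the style of Khosla et al. (2020). This is a minimal NumPy illustration, not the authors' implementation (see the linked chesterdu/contrastive_summary repository for that); the function name `sup_con_loss` and the pairing convention (row `i` is a document, row `i + N` its summary) are illustrative assumptions.

```python
import numpy as np

def sup_con_loss(features, labels, temperature=0.1):
    """Toy supervised contrastive loss.

    features: (M, d) embeddings; by assumption, each document and its
        generated summary appear as two rows sharing the same label,
        so summaries act as augmented positive views.
    labels:   (M,) integer class labels; all same-label rows are positives.
    """
    # L2-normalize so dot products are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature

    m = len(labels)
    logits_mask = 1.0 - np.eye(m)                      # exclude self-similarity
    pos_mask = (labels[:, None] == labels[None, :]).astype(float) * logits_mask

    # log-softmax over all other samples for each anchor
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))

    # average log-probability of the positives per anchor, then negate
    pos_counts = np.maximum(pos_mask.sum(axis=1), 1.0)
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1) / pos_counts
    return -mean_log_prob_pos.mean()
```

With well-separated classes (each document embedded near its summary and its class-mates), this loss is lower than when same-class pairs are scattered, which is the training signal the paper exploits under limited annotation.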
Anthology ID:
2021.findings-emnlp.118
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1365–1376
URL:
https://aclanthology.org/2021.findings-emnlp.118
DOI:
10.18653/v1/2021.findings-emnlp.118
Cite (ACL):
Yangkai Du, Tengfei Ma, Lingfei Wu, Fangli Xu, Xuhong Zhang, Bo Long, and Shouling Ji. 2021. Constructing contrastive samples via summarization for text classification with limited annotations. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1365–1376, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Constructing contrastive samples via summarization for text classification with limited annotations (Du et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.118.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.118.mp4
Code:
chesterdu/contrastive_summary
Data:
AG News