Pcc-tuning: Breaking the Contrastive Learning Ceiling in Semantic Textual Similarity

Bowen Zhang, Chunping Li


Abstract
Semantic Textual Similarity (STS) constitutes a critical research direction in computational linguistics and serves as a key indicator of the encoding capabilities of embedding models. Driven by advances in pre-trained language models and contrastive learning, leading sentence representation methods have reached an average Spearman’s correlation score of approximately 86 across seven STS benchmarks in SentEval. However, further progress has become increasingly marginal, with no existing method attaining an average score higher than 86.5 on these tasks. This paper conducts an in-depth analysis of this phenomenon and concludes that the upper limit for Spearman’s correlation scores under contrastive learning is 87.5. To transcend this ceiling, we propose an innovative approach termed Pcc-tuning, which employs Pearson’s correlation coefficient as a loss function to refine model performance beyond contrastive learning. Experimental results demonstrate that Pcc-tuning can markedly surpass previous state-of-the-art strategies with only a minimal amount of fine-grained annotated samples.
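The core idea of the abstract, using Pearson's correlation coefficient as a training objective over fine-grained similarity scores, can be illustrated with a minimal sketch. This is not the paper's implementation; the function names and the simple `1 - r` loss form are illustrative assumptions.

```python
import math

def pearson_corr(preds, labels):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(preds)
    mean_p = sum(preds) / n
    mean_l = sum(labels) / n
    cov = sum((p - mean_p) * (l - mean_l) for p, l in zip(preds, labels))
    std_p = math.sqrt(sum((p - mean_p) ** 2 for p in preds))
    std_l = math.sqrt(sum((l - mean_l) ** 2 for l in labels))
    return cov / (std_p * std_l)

def pcc_loss(preds, labels):
    """Illustrative loss: shrinks toward 0 as predicted similarities
    become linearly correlated with the gold annotations."""
    return 1.0 - pearson_corr(preds, labels)
```

In an actual fine-tuning setup the same quantity would be computed over a batch of model-predicted sentence-pair similarities with a differentiable tensor library, so that gradients flow back into the encoder.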
Anthology ID:
2024.emnlp-main.791
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
14290–14302
Language:
URL:
https://aclanthology.org/2024.emnlp-main.791
DOI:
10.18653/v1/2024.emnlp-main.791
Bibkey:
Cite (ACL):
Bowen Zhang and Chunping Li. 2024. Pcc-tuning: Breaking the Contrastive Learning Ceiling in Semantic Textual Similarity. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 14290–14302, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Pcc-tuning: Breaking the Contrastive Learning Ceiling in Semantic Textual Similarity (Zhang & Li, EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.791.pdf
Software:
2024.emnlp-main.791.software.zip
Data:
2024.emnlp-main.791.data.zip