
Improving Sentence Embeddings with Automatic Generation of Training Data Using Few-shot Examples

Soma Sato, Hayato Tsukagoshi, Ryohei Sasano, Koichi Takeda


Abstract
Decoder-based large language models (LLMs) have shown high performance on many natural language processing tasks. This is also true for sentence embedding learning, where the decoder-based model PromptEOL has achieved the best performance on semantic textual similarity (STS) tasks. However, PromptEOL requires a manually annotated natural language inference (NLI) dataset for fine-tuning. We aim to improve sentence embeddings without large manually annotated datasets by automatically generating an NLI dataset with an LLM and using it to fine-tune PromptEOL. To achieve this, we explore data generation methods suitable for sentence embedding learning. Specifically, we focus on automatic dataset generation through few-shot learning and explore appropriate methods for leveraging few-shot examples. Experimental results on the STS tasks demonstrate that our approach outperforms existing models in settings without large manually annotated datasets.
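The sketch below illustrates the two components the abstract describes: few-shot prompting of an LLM to generate NLI-style training pairs, and PromptEOL-style sentence embedding extraction (taking the hidden state of the final token of a fill-in-the-blank template). This is a minimal illustration, not the authors' code; the model name, prompt wording, and few-shot examples are assumptions for demonstration.

```python
# Minimal sketch (not the paper's implementation) of:
#   (1) a few-shot prompt for generating NLI-style entailment/contradiction
#       pairs with an LLM, and
#   (2) PromptEOL-style embedding extraction.
# The backbone model and all prompt text are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # assumed backbone; the paper may use a different LLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# (1) Few-shot NLI data generation: show annotated examples, then ask the
# LLM to complete an entailment/contradiction pair for a new premise.
few_shot_prompt = (
    'Premise: "A man is playing a guitar."\n'
    'Entailment: "A man is playing an instrument."\n'
    'Contradiction: "A man is reading a book."\n'
    '\n'
    'Premise: "Two dogs run across a field."\n'
    'Entailment:'
)

# (2) PromptEOL: embed a sentence as the last-layer hidden state of the
# final token of the template 'This sentence : "[X]" means in one word:"'.
def prompteol_embed(sentence: str) -> torch.Tensor:
    prompt = f'This sentence : "{sentence}" means in one word:"'
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs, output_hidden_states=True)
    # hidden_states[-1]: last layer; [0, -1]: last token of the single input
    return outputs.hidden_states[-1][0, -1]
```

In this setup, the generated (premise, entailment, contradiction) triples would serve as positives and hard negatives for contrastive fine-tuning, the role the manually annotated NLI dataset plays in the original PromptEOL recipe.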
Anthology ID:
2024.acl-srw.43
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Xiyan Fu, Eve Fleisig
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
378–389
URL:
https://aclanthology.org/2024.acl-srw.43
DOI:
10.18653/v1/2024.acl-srw.43
Cite (ACL):
Soma Sato, Hayato Tsukagoshi, Ryohei Sasano, and Koichi Takeda. 2024. Improving Sentence Embeddings with Automatic Generation of Training Data Using Few-shot Examples. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 378–389, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Improving Sentence Embeddings with Automatic Generation of Training Data Using Few-shot Examples (Sato et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-srw.43.pdf