
Refining Idioms Semantics Comprehension via Contrastive Learning and Cross-Attention

Mingmin Wu, Guixin Su, Yongcheng Zhang, Zhongqiang Huang, Ying Sha


Abstract
Chinese idioms on social media demand a nuanced understanding for correct usage. The Chinese idiom cloze test poses a unique challenge for machine reading comprehension because the figurative meanings of idioms deviate from their literal interpretations, resulting in a semantic bias in models’ comprehension of idioms. Furthermore, because the figurative meanings of many idioms are similar, such idioms used as suboptimal options can interfere with selecting the optimal one. Despite some success on the Chinese idiom cloze test, existing deep-learning methods still struggle to comprehensively grasp idiom semantics due to these issues. To tackle these challenges, we introduce a Refining Idioms Semantics Comprehension Framework (RISCF) to capture comprehensive idiom semantics. Specifically, we propose a semantic sense contrastive learning module that enhances the representation of idiom semantics and diminishes the semantic bias between the figurative and literal meanings of idioms. Meanwhile, we propose an interference-resistant cross-attention module that attenuates the interference of suboptimal options by modeling the interaction between the candidate idioms and the blank in the context. Experimental results on benchmark datasets demonstrate the effectiveness of our RISCF model, which significantly outperforms state-of-the-art methods.
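
The abstract names two components: a contrastive objective that aligns an idiom's figurative (contextual) sense with its literal/definition sense, and a cross-attention step that lets candidate idioms interact with the blank position before scoring. Below is a minimal PyTorch sketch of how such modules are commonly realized; the class names, the InfoNCE-style loss, the temperature, the head count, and the linear scoring head are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SenseContrastiveLoss(nn.Module):
    """InfoNCE-style loss pulling an idiom's contextual (figurative) embedding
    toward its dictionary-definition embedding and away from the other idioms
    in the batch. A generic sketch, not the paper's exact formulation."""

    def __init__(self, temperature: float = 0.07):
        super().__init__()
        self.temperature = temperature

    def forward(self, figurative: torch.Tensor, literal: torch.Tensor) -> torch.Tensor:
        # figurative, literal: (batch, dim); row i of each refers to the same idiom.
        fig = F.normalize(figurative, dim=-1)
        lit = F.normalize(literal, dim=-1)
        logits = fig @ lit.t() / self.temperature            # (batch, batch) similarity matrix
        targets = torch.arange(fig.size(0), device=fig.device)
        return F.cross_entropy(logits, targets)              # diagonal entries are the positives


class BlankCandidateCrossAttention(nn.Module):
    """Fuses each candidate idiom embedding with the blank's contextual
    representation via cross-attention, then scores the candidates so that
    near-synonymous distractors compete directly. Again a hedged sketch."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.scorer = nn.Linear(dim, 1)

    def forward(self, blank_repr: torch.Tensor, candidates: torch.Tensor) -> torch.Tensor:
        # blank_repr:  (batch, 1, dim)              — hidden state at the blank position
        # candidates:  (batch, num_candidates, dim) — embeddings of the candidate idioms
        fused, _ = self.attn(query=candidates, key=blank_repr, value=blank_repr)
        scores = self.scorer(fused + candidates).squeeze(-1)  # (batch, num_candidates)
        return scores  # a softmax over candidates selects the predicted idiom
```

In use, such a contrastive loss would typically be added to the standard cloze cross-entropy over the candidate scores, with a weighting hyperparameter balancing the two terms.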
Anthology ID:
2024.lrec-main.1203
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
13785–13795
URL:
https://aclanthology.org/2024.lrec-main.1203
Cite (ACL):
Mingmin Wu, Guixin Su, Yongcheng Zhang, Zhongqiang Huang, and Ying Sha. 2024. Refining Idioms Semantics Comprehension via Contrastive Learning and Cross-Attention. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 13785–13795, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Refining Idioms Semantics Comprehension via Contrastive Learning and Cross-Attention (Wu et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.1203.pdf