
Code Needs Comments: Enhancing Code LLMs with Comment Augmentation

Demin Song, Honglin Guo, Yunhua Zhou, Shuhao Xing, Yudong Wang, Zifan Song, Wenwei Zhang, Qipeng Guo, Hang Yan, Xipeng Qiu, Dahua Lin


Abstract
Programming skill is a crucial ability for Large Language Models (LLMs), requiring a deep understanding of programming languages (PLs) and their correlation with natural languages (NLs). We examine the impact of pre-training data on code-focused LLMs' performance, using comment density as a measure of PL-NL alignment. Given the scarcity of code-comment aligned data in pre-training corpora, we introduce a novel data augmentation method that generates comments for existing code, coupled with a data filtering strategy that removes code data poorly correlated with natural language. We conducted experiments on three code-focused LLMs and observed consistent performance improvements on two widely used programming skill benchmarks. Notably, the model trained on the augmented data outperformed both the model used to generate the comments and the model further trained on the data without augmentation.
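
To make the filtering idea concrete, below is a minimal sketch of a comment-density measure and a density-based corpus filter of the kind the abstract describes. This is an illustrative reconstruction, not the paper's implementation: the line-based, Python-only `comment_density` heuristic, the `filter_low_alignment` helper, and the 0.1 threshold are all assumptions.

```python
import io
import tokenize

def comment_density(source: str) -> float:
    """Fraction of non-blank source lines that carry a '#' comment.

    A simplified, hypothetical proxy for the paper's PL-NL alignment
    measure; the authors' exact definition may differ.
    """
    lines = [ln for ln in source.splitlines() if ln.strip()]
    if not lines:
        return 0.0
    comment_lines = set()
    try:
        for tok in tokenize.generate_tokens(io.StringIO(source).readline):
            if tok.type == tokenize.COMMENT:
                comment_lines.add(tok.start[0])  # 1-indexed row of the comment
    except (tokenize.TokenError, SyntaxError):
        return 0.0  # unparseable snippet: treat as uncommented
    return len(comment_lines) / len(lines)

def filter_low_alignment(snippets, threshold=0.1):
    """Keep snippets whose comment density meets a (hypothetical) threshold,
    mirroring the filtering step described in the abstract."""
    return [s for s in snippets if comment_density(s) >= threshold]

if __name__ == "__main__":
    commented = "def add(a, b):\n    # return the sum of a and b\n    return a + b\n"
    bare = "def add(a, b):\n    return a + b\n"
    print(comment_density(commented))            # ~0.33 (1 of 3 lines)
    print(len(filter_low_alignment([commented, bare])))  # 1: only the commented snippet survives
```

In the paper's pipeline, a filter like this would be complemented by the augmentation step: rather than discarding all low-density code, an LLM generates comments for existing code so that more of the corpus is PL-NL aligned.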
Anthology ID:
2024.findings-acl.809
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13640–13656
URL:
https://aclanthology.org/2024.findings-acl.809
DOI:
10.18653/v1/2024.findings-acl.809
Cite (ACL):
Demin Song, Honglin Guo, Yunhua Zhou, Shuhao Xing, Yudong Wang, Zifan Song, Wenwei Zhang, Qipeng Guo, Hang Yan, Xipeng Qiu, and Dahua Lin. 2024. Code Needs Comments: Enhancing Code LLMs with Comment Augmentation. In Findings of the Association for Computational Linguistics: ACL 2024, pages 13640–13656, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Code Needs Comments: Enhancing Code LLMs with Comment Augmentation (Song et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.809.pdf