StyleBART: Decorate Pretrained Model with Style Adapters for Unsupervised Stylistic Headline Generation

Hanqing Wang, Yajing Luo, Boya Xiong, Guanhua Chen, Yun Chen


Abstract
Stylistic headline generation is the task of generating a headline that not only summarizes the content of an article but also reflects a desired style that attracts users. As style-specific article-headline pairs are scarce, previous research has focused on unsupervised approaches using a standard headline generation dataset and mono-style corpora. In this work, we follow this line and propose StyleBART, an unsupervised approach for stylistic headline generation. Our method decorates the pretrained BART model with adapters that are responsible for different styles, allowing the generation of headlines with diverse styles by simply switching the adapters. Different from previous works, StyleBART separates the tasks of style learning and headline generation, making it possible to freely combine the base model and the style adapters during inference. We further propose an inverse paraphrasing task to enhance the style adapters. Extensive automatic and human evaluations show that StyleBART achieves new state-of-the-art performance on the unsupervised stylistic headline generation task, producing high-quality headlines with the desired style.
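The adapter-switching idea the abstract describes can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation: it assumes a standard bottleneck adapter design (down-projection, nonlinearity, up-projection, residual connection), and the style names and sizes are placeholders.

```python
import torch
import torch.nn as nn


class StyleAdapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual.
    Illustrative only; the paper's adapter configuration may differ."""

    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))


class AdapterSwitch(nn.Module):
    """Holds one adapter per style; switching the active style at inference
    time leaves the frozen base model untouched."""

    def __init__(self, hidden_size: int, styles: list):
        super().__init__()
        self.adapters = nn.ModuleDict(
            {s: StyleAdapter(hidden_size) for s in styles}
        )
        self.active = styles[0]

    def set_style(self, style: str) -> None:
        self.active = style

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return self.adapters[self.active](h)


# Hypothetical style names; BART-base uses hidden size 768.
switch = AdapterSwitch(hidden_size=768, styles=["humor", "romance", "clickbait"])
switch.set_style("humor")
out = switch(torch.randn(2, 16, 768))  # (batch, sequence, hidden)
print(out.shape)
```

In such a setup the base model and each per-style adapter would be trained separately, so any adapter can be paired with the headline-generation model at inference, which is the decoupling the abstract claims.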
Anthology ID:
2023.findings-emnlp.697
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10387–10399
URL:
https://aclanthology.org/2023.findings-emnlp.697
DOI:
10.18653/v1/2023.findings-emnlp.697
Cite (ACL):
Hanqing Wang, Yajing Luo, Boya Xiong, Guanhua Chen, and Yun Chen. 2023. StyleBART: Decorate Pretrained Model with Style Adapters for Unsupervised Stylistic Headline Generation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 10387–10399, Singapore. Association for Computational Linguistics.
Cite (Informal):
StyleBART: Decorate Pretrained Model with Style Adapters for Unsupervised Stylistic Headline Generation (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.697.pdf