
Dynamic Structured Neural Topic Model with Self-Attention Mechanism

Nozomu Miyamoto, Masaru Isonuma, Sho Takase, Junichiro Mori, Ichiro Sakata


Abstract
This study presents a dynamic structured neural topic model, which can handle the time-series development of topics while capturing their dependencies. Our model captures topic branching and merging processes by modeling topic dependencies with a self-attention mechanism. Additionally, we introduce citation regularization, which induces the attention weights to represent citation relations by modeling text and citations jointly. Our model outperforms a prior dynamic embedded topic model in terms of perplexity and coherence, while maintaining sufficient diversity across topics. Furthermore, we confirm that our model can potentially predict emerging topics from academic literature.
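
The two ideas in the abstract can be illustrated with a short sketch. The Python/PyTorch code below is a minimal illustration, not the authors' implementation: it assumes topic embeddings at consecutive time steps, applies scaled dot-product self-attention so that each topic at time t attends over all topics at time t-1 (allowing branching and merging), and adds a hypothetical citation-regularization term that pushes the attention weights toward a row-normalized citation matrix. All dimensions, the placeholder citation matrix, and the KL-based regularizer are assumptions for illustration only.

# Minimal sketch (not the authors' code) of the two mechanisms in the abstract:
# (1) topic dependencies via self-attention over the previous step's topics,
# (2) a citation regularizer aligning attention weights with citation relations.
import torch
import torch.nn.functional as F

K, D = 8, 64                       # number of topics, embedding size (assumed)
prev_topics = torch.randn(K, D)    # topic embeddings at time t-1 (placeholder)
queries = torch.randn(K, D)        # topic embeddings at time t (placeholder)

# Scaled dot-product self-attention: each topic at time t attends to all
# topics at time t-1, so a topic can "merge" several parents (high weight on
# many predecessors) or a parent can "branch" into several children.
scores = queries @ prev_topics.T / D ** 0.5      # (K, K) attention logits
attn = F.softmax(scores, dim=-1)                 # each row sums to 1
topics_t = attn @ prev_topics                    # dependency-aware topics

# Hypothetical citation regularizer: encourage the attention matrix to match
# a row-normalized citation matrix C, where C[i, j] reflects citations from
# documents of topic i at time t to documents of topic j at time t-1.
citations = torch.rand(K, K)                     # placeholder citation counts
C = citations / citations.sum(dim=-1, keepdim=True)
citation_reg = F.kl_div(attn.log(), C, reduction="batchmean")

loss = citation_reg                              # added to the topic-model loss
print(topics_t.shape, float(citation_reg))

In this sketch the regularizer is a KL divergence between the attention rows and the normalized citation matrix; the paper's actual loss may differ, but the sketch shows how jointly modeling text and citations can induce attention weights that mirror citation relations.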
Anthology ID:
2023.findings-acl.366
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5916–5930
URL:
https://aclanthology.org/2023.findings-acl.366
DOI:
10.18653/v1/2023.findings-acl.366
Cite (ACL):
Nozomu Miyamoto, Masaru Isonuma, Sho Takase, Junichiro Mori, and Ichiro Sakata. 2023. Dynamic Structured Neural Topic Model with Self-Attention Mechanism. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5916–5930, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Dynamic Structured Neural Topic Model with Self-Attention Mechanism (Miyamoto et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.366.pdf