
Neural text normalization leveraging similarities of strings and sounds

Riku Kawamura, Tatsuya Aoki, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura


Abstract
We propose neural models that can normalize text by considering the similarities of word strings and sounds. We experimentally compared a model that considers the similarities of both word strings and sounds, a model that considers only the similarity of word strings or of sounds, and a model without the similarities as a baseline. Results showed that leveraging word string similarity succeeded in dealing with misspellings and abbreviations, while taking sound similarity into account succeeded in dealing with phonetic substitutions and emphasized characters. As a result, the proposed models achieved higher F1 scores than the baseline.
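To illustrate the general idea of combining string and sound similarity for normalization (not the paper's neural model), here is a minimal non-neural Python sketch. The candidate lexicon, the toy sound key, and the weighting scheme are all illustrative assumptions.

```python
# Minimal sketch (not the paper's method): score normalization candidates
# by mixing a character-string similarity with a rough "sound" similarity.
# The sound key below is a toy assumption, not a real phonetic transcription.
import difflib


def string_similarity(a: str, b: str) -> float:
    """Character-level similarity in [0, 1] via difflib's ratio."""
    return difflib.SequenceMatcher(None, a, b).ratio()


def sound_key(word: str) -> str:
    """Toy sound key: lowercase, map a few phonetic/visual substitutions,
    and collapse repeated characters (handles emphasis like 'gooood')."""
    substitutions = {"0": "o", "1": "i", "3": "e", "4": "a", "@": "a"}
    chars = [substitutions.get(c, c) for c in word.lower()]
    collapsed = [c for i, c in enumerate(chars) if i == 0 or c != chars[i - 1]]
    return "".join(collapsed)


def sound_similarity(a: str, b: str) -> float:
    """Similarity of the toy sound keys, again via difflib."""
    return difflib.SequenceMatcher(None, sound_key(a), sound_key(b)).ratio()


def best_candidate(noisy: str, lexicon: list[str], w: float = 0.5) -> str:
    """Pick the lexicon entry maximizing a weighted mix of both scores."""
    return max(
        lexicon,
        key=lambda c: w * string_similarity(noisy, c)
        + (1 - w) * sound_similarity(noisy, c),
    )


if __name__ == "__main__":
    lexicon = ["good", "great", "cool"]
    for noisy in ["gooood", "grrreat", "c00l"]:
        print(noisy, "->", best_candidate(noisy, lexicon))
```

In this sketch, the string score handles near-miss spellings, while the sound key handles character substitutions and elongation; the paper instead learns such behavior with neural models.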
Anthology ID:
2020.coling-main.192
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2126–2131
URL:
https://aclanthology.org/2020.coling-main.192
DOI:
10.18653/v1/2020.coling-main.192
Cite (ACL):
Riku Kawamura, Tatsuya Aoki, Hidetaka Kamigaito, Hiroya Takamura, and Manabu Okumura. 2020. Neural text normalization leveraging similarities of strings and sounds. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2126–2131, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Neural text normalization leveraging similarities of strings and sounds (Kawamura et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.192.pdf