
MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving

Zhenwen Liang, Jipeng Zhang, Lei Wang, Wei Qin, Yunshi Lan, Jie Shao, Xiangliang Zhang


Abstract
Math word problem (MWP) solving faces a dilemma in number representation learning. To sidestep the number representation issue and reduce the search space of feasible solutions, existing MWP solvers usually replace real numbers with symbolic placeholders and focus on logical reasoning. However, unlike common symbolic reasoning tasks such as program synthesis and knowledge graph reasoning, MWP solving additionally demands numerical reasoning: what matters is not the number value itself but its reusable numerical properties. We therefore argue that injecting numerical properties into the symbolic placeholders through a contextualized representation learning scheme offers a way out of this dilemma. In this work, we apply this idea to pre-trained language model (PLM) techniques and build MWP-BERT, an effective contextual number representation PLM. We demonstrate the effectiveness of MWP-BERT on MWP solving and several MWP-specific understanding tasks on both English and Chinese benchmarks.
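To make the placeholder idea concrete, the sketch below illustrates the common preprocessing step the abstract refers to: replacing each number in a problem with a symbolic placeholder while recording simple numerical properties alongside it. This is a minimal illustration, not the authors' actual pipeline; the function name `mask_numbers` and the specific properties (integer-ness, rough magnitude) are assumptions chosen for demonstration, and the released code should be consulted for the real pre-training objectives.

```python
import re

# Minimal sketch (not the authors' exact pipeline): replace each number in an
# MWP with a symbolic placeholder NUM_i, while keeping simple numerical
# properties (illustrative choices) that a numeracy-aware objective could use.

NUMBER_PATTERN = re.compile(r"\d+(?:\.\d+)?")

def mask_numbers(problem_text):
    """Return the masked problem text plus a property record for each number."""
    numbers = []

    def _replace(match):
        value = float(match.group())
        idx = len(numbers)
        numbers.append({
            "placeholder": f"NUM_{idx}",
            "value": value,
            "is_integer": value.is_integer(),   # example numerical property
            "magnitude": len(str(int(value))),  # rough order-of-magnitude cue
        })
        return f"NUM_{idx}"

    masked = NUMBER_PATTERN.sub(_replace, problem_text)
    return masked, numbers

if __name__ == "__main__":
    text = "Tom has 3 apples and buys 2.5 kg of pears for 12 dollars."
    masked, props = mask_numbers(text)
    print(masked)  # Tom has NUM_0 apples and buys NUM_1 kg of pears for NUM_2 dollars.
    for p in props:
        print(p)
```

In this view, the solver reasons over the `NUM_i` symbols, and numeracy-augmented pre-training is what lets the contextual encoder carry the recorded properties into the placeholder representations.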
Anthology ID:
2022.findings-naacl.74
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
997–1009
URL:
https://aclanthology.org/2022.findings-naacl.74
DOI:
10.18653/v1/2022.findings-naacl.74
Cite (ACL):
Zhenwen Liang, Jipeng Zhang, Lei Wang, Wei Qin, Yunshi Lan, Jie Shao, and Xiangliang Zhang. 2022. MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 997–1009, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving (Liang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.74.pdf
Video:
https://aclanthology.org/2022.findings-naacl.74.mp4
Code
lzhenwen/mwp-bert
Data
DROP, Math23K, MathQA