BEM-SM: A BERT-Encoder Model with Symmetry Supervision Module for Solving Math Word Problem.

Yijia Zhang, Tiancheng Zhang, Peng Xie, Minghe Yu, Ge Yu

Symmetry(2023)

Abstract
To find solutions to math word problems, several modules have been designed to check the generated expressions, but they neither take into account the symmetry between math word problems and their corresponding mathematical expressions, nor exploit the strength of pretrained language models in natural language understanding tasks. Moreover, designing fine-tuning tasks for pretrained language models that encourage cooperation with other modules to improve the performance of math word problem solvers remains an open problem. To address these problems, in this paper we propose a BERT-based model for solving math word problems with a supervision module. Building on pretrained language models, we present a fine-tuning task that predicts the number of each operator in the expressions, so that the model learns the latent relationships between problems and expressions. Meanwhile, a supervision module is designed to check incorrectly generated expressions and to improve the model's performance by optimizing the encoder. Experiments are conducted on three datasets, and the results demonstrate the effectiveness of our model and the design of its components.
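As a rough illustration of the auxiliary fine-tuning target described above, the sketch below derives per-operator occurrence counts from a gold expression; an encoder could then be fine-tuned to predict these counts from the problem text. The operator set and the interpretation of "number of different operators" as per-operator counts are assumptions, not details taken from the paper.

```python
from collections import Counter

# Assumed operator vocabulary; the paper's actual set may differ.
OPERATORS = ["+", "-", "*", "/", "^"]

def operator_counts(expression: str) -> list[int]:
    """Count occurrences of each operator in a math expression string.

    The resulting vector could serve as the label for an auxiliary
    fine-tuning task on the encoder (a hypothetical setup, for
    illustration only).
    """
    counts = Counter(ch for ch in expression if ch in OPERATORS)
    return [counts[op] for op in OPERATORS]

# "3 + 5 * 2 - 1" contains one '+', one '-', and one '*':
print(operator_counts("3 + 5 * 2 - 1"))  # [1, 1, 1, 0, 0]
```

In practice such a count vector would be predicted from the problem encoding (e.g. via a small regression or classification head on top of BERT), letting the encoder absorb structural information about the target expression before decoding.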
Keywords
math word problems, natural language processing, pre-trained models