One Semantic Parser To Parse Them All: Sequence To Sequence Multi-Task Learning On Semantic Parsing Datasets

10th Conference on Lexical and Computational Semantics (*SEM 2021), 2021

Abstract
Semantic parsers map natural language utterances to meaning representations. The lack of a single standard for meaning representations has led to a plethora of semantic parsing datasets. To unify these datasets and train a single model for all of them, we investigate Multi-Task Learning (MTL) architectures. We experiment with five datasets (GEOQUERY, NLMAPS, TOP, OVERNIGHT, AMR). We find that an MTL architecture that shares the entire network across datasets yields parsing accuracies competitive with or better than the single-task baselines, while reducing the total number of parameters by 68%. We further provide evidence that MTL also generalizes compositionally better than single-task models. Finally, we compare task sampling methods and propose a competitive alternative to the widespread proportional sampling strategy.
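The proportional sampling baseline mentioned above can be sketched concisely. In a minimal version, each training batch is drawn from a task with probability proportional to that task's dataset size; a temperature parameter generalizes this toward uniform sampling. The dataset sizes below are purely illustrative placeholders, not figures from the paper, and the function names are our own:

```python
import random

# Illustrative (made-up) dataset sizes for the five tasks.
dataset_sizes = {"GEOQUERY": 1000, "NLMAPS": 25000, "TOP": 30000,
                 "OVERNIGHT": 13000, "AMR": 35000}

def sampling_weights(sizes, temperature=1.0):
    """Return per-task sampling probabilities.

    temperature=1.0 gives proportional sampling (p_i = n_i / sum n_j);
    larger temperatures flatten the distribution toward uniform.
    """
    scaled = {t: n ** (1.0 / temperature) for t, n in sizes.items()}
    total = sum(scaled.values())
    return {t: w / total for t, w in scaled.items()}

def sample_task(sizes, temperature=1.0, rng=random):
    """Draw one task name according to the (temperature-scaled) weights."""
    weights = sampling_weights(sizes, temperature)
    tasks = list(weights)
    return rng.choices(tasks, weights=[weights[t] for t in tasks], k=1)[0]
```

With temperature 1.0 the largest dataset dominates the batch mix, which is exactly the imbalance that motivates exploring alternative sampling strategies.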
Keywords
semantic parser,learning,sequence,multi-task