Pretrained Transformers for Simple Question Answering over Knowledge Graphs

Lecture Notes in Computer Science (2019)

Abstract
Answering simple questions over knowledge graphs is a well-studied problem in question answering. Previous approaches to this task were built on recurrent and convolutional neural network architectures that use pretrained word embeddings. It was recently shown that fine-tuning pretrained transformer networks (e.g., BERT) can outperform previous approaches on various natural language processing tasks. In this work, we investigate how well BERT performs on SimpleQuestions and provide an evaluation of both BERT- and BiLSTM-based models in limited-data scenarios.
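
The approach the abstract describes, fine-tuning BERT on SimpleQuestions, can be sketched as follows. This is an illustration only, not the authors' implementation: framing simple QA as relation classification over the question is one common decomposition of the task, and the model checkpoint, label count (NUM_RELATIONS), learning rate, and toy batch below are all assumptions.

```python
# Hedged sketch: fine-tuning BERT for the relation-prediction
# subtask of SimpleQuestions-style QA, using Hugging Face
# transformers. NOT the paper's code; hyperparameters and the
# relation inventory size are placeholders.
import torch
from torch.optim import AdamW
from transformers import BertTokenizer, BertForSequenceClassification

NUM_RELATIONS = 1837  # hypothetical size of the KG relation inventory

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_RELATIONS
)
optimizer = AdamW(model.parameters(), lr=2e-5)

# Toy batch of (question, gold relation id) pairs; real training
# would iterate over the SimpleQuestions training split.
batch = [("who wrote the book gone with the wind", 42)]
questions, labels = zip(*batch)

inputs = tokenizer(list(questions), return_tensors="pt",
                   padding=True, truncation=True)
labels = torch.tensor(labels)

model.train()
outputs = model(**inputs, labels=labels)  # cross-entropy over relations
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

For the limited-data scenarios the paper evaluates, the same loop would presumably be run over progressively smaller subsamples of the training split, which is where the comparison between fine-tuned BERT and BiLSTM baselines becomes interesting.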
Keywords
knowledge