A Neural Ranker for Open-Domain Question Answering via Sentence-Level Jump Encoding

2021 International Joint Conference on Neural Networks (IJCNN)

Abstract
Open-domain question answering extracts the answer to a question from a large-scale corpus and typically employs a ranker to filter out irrelevant paragraphs. Compared with traditional retrieval approaches based on statistical features, neural ranking models have gradually become a popular choice for filtering because of their capacity for extracting semantic features. However, a massive corpus demands high filtering efficiency, so accelerating textual encoding in neural models deserves attention, and many advances have been made from various perspectives. Inspired by human reading habits, this paper presents a novel neural ranker built on sentence-level jump encoding, which skips irrelevant sentences better than existing neural models. When people browse text with a question in mind, they tend to locate the key sentence using heuristic information from the question and from what they are currently reading. Our sentence-level jump encoder works in a similar way, effectively and accurately locating the key sentences for a question. We evaluate our neural ranker on the long-paragraph setting of the Quasar-T dataset; our method achieves significant improvements in both ranking acceleration and ranking accuracy, and also surpasses other question answering baselines on end-to-end question answering results.
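The abstract only names the idea, so the following is a minimal PyTorch sketch of one plausible reading of sentence-level jump encoding, not the paper's actual architecture: a per-sentence skip/read policy conditioned on the question and the running paragraph state, with sampled actions whose log-probabilities could be trained with a REINFORCE-style ranking reward (all names, layer choices, and sizes here, such as SentenceJumpRanker and jump_policy, are hypothetical).

```python
import torch
import torch.nn as nn

class SentenceJumpRanker(nn.Module):
    """Hypothetical sketch: encode a paragraph sentence by sentence,
    letting a small policy decide whether to jump over each sentence
    given the question and the running paragraph state."""

    def __init__(self, emb_dim=300, hidden=256):
        super().__init__()
        self.sent_encoder = nn.GRU(emb_dim, hidden, batch_first=True)
        self.para_cell = nn.GRUCell(hidden, hidden)
        # Policy head: read (1) vs. jump (0) from [question; state].
        self.jump_policy = nn.Linear(hidden * 2, 2)
        self.scorer = nn.Linear(hidden * 2, 1)

    def forward(self, question_vec, sentences):
        # question_vec: (hidden,) pooled question encoding (assumed given)
        # sentences: list of (num_words, emb_dim) tensors, one per sentence
        state = torch.zeros_like(question_vec)
        log_probs = []  # kept for REINFORCE-style policy training
        for sent in sentences:
            _, h = self.sent_encoder(sent.unsqueeze(0))
            sent_vec = h.squeeze(0).squeeze(0)
            logits = self.jump_policy(torch.cat([question_vec, state]))
            dist = torch.distributions.Categorical(logits=logits)
            action = dist.sample()  # 0 = jump over, 1 = read
            log_probs.append(dist.log_prob(action))
            if action.item() == 1:
                # Only sentences the policy reads update the paragraph state,
                # so skipped sentences cost no recurrent computation.
                state = self.para_cell(sent_vec.unsqueeze(0),
                                       state.unsqueeze(0)).squeeze(0)
        relevance = self.scorer(torch.cat([question_vec, state]))
        return relevance, torch.stack(log_probs)
```

Under this reading, the "reinforcement learning" keyword would correspond to rewarding jump decisions that preserve ranking accuracy while skipping many sentences; the discrete skip action is non-differentiable, which is why the sketch records log-probabilities for a policy-gradient loss rather than backpropagating through the action.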
Keywords
question answering, deep learning, reinforcement learning, paragraph ranking