TopiQAL: Topic-aware Question Answering using Scalable Domain-specific Supercomputers

2020 IEEE/ACM Fourth Workshop on Deep Learning on Supercomputers (DLS), 2020

Abstract
We all have questions: about today's temperature, the score of our favorite baseball team, the Universe, and about a vaccine for COVID-19. Life, physical, and natural scientists have sought answers to questions on such topics through scientific methods and experiments, while computer scientists have built language models as a small step toward automatically answering questions across domains given a little context. In this paper, we propose an architecture using state-of-the-art Natural Language Processing language models, namely Topic Models and Bidirectional Encoder Representations from Transformers (BERT), that can transparently and automatically retrieve articles relevant to a question and fetch answers to topical questions from current and historical COVID-19 medical research literature. We demonstrate the benefits of using domain-specific supercomputers such as Tensor Processing Units (TPUs) residing on cloud-based infrastructure, with which we achieve significant gains in training and inference times at very minimal cost.
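The abstract describes a two-stage pipeline: a topic model first filters the literature for articles relevant to a question, and a BERT model then extracts an answer span from the retrieved text. The sketch below is a minimal reconstruction of that idea using off-the-shelf gensim LDA and a SQuAD-finetuned BERT checkpoint from Hugging Face; the toy corpus, checkpoint name, and similarity scoring are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of the two-stage pipeline described in the abstract:
# an LDA topic model filters a corpus for articles relevant to a question,
# then a BERT question-answering model extracts an answer span.
# The toy corpus, checkpoint, and scoring function are assumptions,
# not the authors' actual configuration.
from gensim import corpora, models
from transformers import pipeline

# Hypothetical stand-in for a collection of COVID-19 research abstracts.
documents = [
    "Vaccine candidates induce neutralizing antibodies against the spike protein.",
    "The virus is transmitted primarily through respiratory droplets.",
    "Clinical trials report high efficacy of mRNA vaccines against severe disease.",
]

# Stage 1: fit a topic model over the corpus (naive whitespace tokenization).
texts = [doc.lower().split() for doc in documents]
dictionary = corpora.Dictionary(texts)
bows = [dictionary.doc2bow(text) for text in texts]
lda = models.LdaModel(bows, num_topics=2, id2word=dictionary, random_state=0)

def topic_vector(bow):
    """Dense topic-mixture vector for a bag-of-words document."""
    return dict(lda.get_document_topics(bow, minimum_probability=0.0))

def retrieve(question, top_k=2):
    """Rank documents by the dot product of their topic mixture with the question's."""
    q = topic_vector(dictionary.doc2bow(question.lower().split()))
    def score(bow):
        d = topic_vector(bow)
        return sum(q.get(t, 0.0) * p for t, p in d.items())
    ranked = sorted(range(len(documents)), key=lambda i: score(bows[i]), reverse=True)
    return [documents[i] for i in ranked[:top_k]]

# Stage 2: extractive QA over the retrieved context with a SQuAD-finetuned
# BERT checkpoint (assumed; the paper's exact model may differ).
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

question = "How effective are mRNA vaccines?"
context = " ".join(retrieve(question))
print(qa(question=question, context=context))
```

In the paper's setting, the BERT fine-tuning and inference would run on cloud TPUs; the retrieval stage shown here is lightweight and independent of the accelerator.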
Keywords
TopiQAL, topic-aware question answering, scalable domain-specific supercomputers, state-of-the-art Natural Language Processing language models, Bidirectional Encoder Representations from Transformers, topical questions, COVID-19 current and historical medical research literature