Gated-Attention Readers for Text Comprehension

ACL, 2017.

Cited by: 238

Abstract:

In this paper we study the problem of answering cloze-style questions over documents. Our model, the Gated-Attention (GA) Reader, integrates a multi-hop architecture with a novel attention mechanism, which is based on multiplicative interactions between the query embedding and the intermediate states of a recurrent neural network document...
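To make the mechanism in the abstract concrete, below is a minimal NumPy sketch of one gated-attention hop: each document token attends over the query token states, and the attended query vector gates that token's RNN state by element-wise multiplication. The function name, toy dimensions, and use of random arrays in place of bi-directional RNN outputs are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gated_attention(doc_states, query_states):
    """One gated-attention hop (sketch).

    doc_states:   (T_d, h) intermediate RNN states for document tokens
    query_states: (T_q, h) RNN states for query tokens
    Returns gated document states of shape (T_d, h).
    """
    # Similarity between every document token and every query token.
    scores = doc_states @ query_states.T            # (T_d, T_q)
    alphas = softmax(scores, axis=-1)               # one distribution per doc token
    # Token-specific query representation via attention over query states.
    query_per_token = alphas @ query_states         # (T_d, h)
    # Multiplicative interaction: gate each document state element-wise.
    return doc_states * query_per_token             # (T_d, h)

# Toy usage with random vectors standing in for RNN outputs.
rng = np.random.default_rng(0)
d = rng.normal(size=(50, 128))   # 50 document tokens, hidden size 128
q = rng.normal(size=(10, 128))   # 10 query tokens
x = gated_attention(d, q)
print(x.shape)                   # (50, 128)
```

In the multi-hop setting described by the abstract, the gated output would be fed into the next document-level RNN layer and the operation repeated, refining the query-aware token representations at each hop.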

