# Attentive Reader
TensorFlow implementation of Google DeepMind's *Teaching Machines to Read and Comprehend*. This implementation contains:
- Deep LSTM Reader
  - with skip connections from the inputs to all hidden layers
  - with peephole weights that provide precise timing
- Attentive Reader (in progress)
  - with bidirectional LSTMs with peephole weights
- Impatient Reader (in progress)
  - with bidirectional LSTMs with peephole weights
  - with recurrent accumulation of information from the document while reading the query
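The Attentive Reader's core attention step can be sketched in NumPy. This is an illustrative sketch based on the model described in the paper, not code from this repo: the weight matrices below are random stand-ins for parameters that would be learned during training.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_read(Y, u, seed=0):
    """Sketch of Attentive Reader attention.

    Y: (T, d) document token encodings (e.g. bidirectional LSTM outputs)
    u: (d,)  query encoding
    Returns the attention weights over tokens and the joint embedding.
    """
    T, d = Y.shape
    # Random stand-ins for learned parameters (illustration only).
    rng = np.random.default_rng(seed)
    W_ym = rng.standard_normal((d, d)) * 0.1
    W_um = rng.standard_normal((d, d)) * 0.1
    w_ms = rng.standard_normal(d) * 0.1
    W_rg = rng.standard_normal((d, d)) * 0.1
    W_ug = rng.standard_normal((d, d)) * 0.1

    M = np.tanh(Y @ W_ym.T + u @ W_um.T)   # (T, d) match scores per token
    s = softmax(M @ w_ms)                  # (T,)  attention weights, sum to 1
    r = s @ Y                              # (d,)  attended document vector
    g = np.tanh(W_rg @ r + W_ug @ u)       # (d,)  joint document/query embedding
    return s, g
```

The Impatient Reader differs in that it recomputes an attended document vector after each query token, accumulating the result recurrently rather than attending once with the final query encoding.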
## Prerequisites
- Python 2.7 or Python 3.3+
- TensorFlow
- NLTK
- Gensim
## Usage
First, download the DeepMind Q&A Dataset from here, save `cnn.tgz` and `dailymail.tgz` into the repo, and run:

```
$ ./unzip.sh cnn.tgz dailymail.tgz
```
Then run the pre-processing code with:

```
$ python data_utils.py data cnn
```
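Pre-processing reads the dataset's `.question` files. A minimal parser sketch, assuming the standard DeepMind Q&A file layout of blank-line-separated blocks (URL, document, question with an `@placeholder` slot, answer entity, then `@entityN:original` mappings); `parse_question_file` is a hypothetical helper for illustration, not a function in this repo:

```python
def parse_question_file(text):
    """Parse one `.question` file from the DeepMind Q&A dataset.

    Assumed layout (blank-line separated): URL, anonymized document,
    question containing @placeholder, answer entity, entity mappings.
    """
    parts = text.strip().split("\n\n")
    url, document, question, answer = parts[0], parts[1], parts[2], parts[3]
    # Entity mapping lines look like "@entity0:Original Name".
    entities = {}
    if len(parts) > 4:
        entities = dict(line.split(":", 1) for line in parts[4].splitlines())
    return {"url": url, "document": document, "question": question,
            "answer": answer, "entities": entities}
```

Example: for a file whose answer block is `@entity0` and whose mapping block contains `@entity0:Alice`, the parser returns `answer == "@entity0"` and `entities["@entity0"] == "Alice"`.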
To train a model on the CNN dataset:

```
$ python main.py --dataset cnn
```
To test an existing model:

```
$ python main.py --dataset cnn --forward_only True
```

(in progress)
## Results
In progress.
## Author
Taehoon Kim / @carpedm20