  • Stars: 144
  • Rank: 247,791 (Top 6%)
  • Language: Python
  • Created over 6 years ago
  • Updated almost 2 years ago

Repository Details

A Neural Conversational Model (hb-research)

TensorFlow implementation of conversation models.

  1. Model

    • seq2seq_attention : Seq2Seq model with an attentional decoder (see the sketch below)
  2. Dataset

    • Cornell Movie-Dialogs Corpus
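
The seq2seq_attention graph pairs an RNN encoder with an attention-based decoder. Below is a minimal, illustrative sketch of how such a decoder can be assembled with the TF 1.x tf.contrib.seq2seq API; the function name, tensor shapes, and defaults are assumptions for illustration, not the repository's actual code.

import tensorflow as tf

def attentional_decoder_sketch(encoder_outputs, encoder_state,
                               decoder_inputs, decoder_lengths,
                               num_units=512, vocab_size=30000):
    """Illustrative TF 1.x decoder with (normed) Bahdanau attention."""
    # Attend over the encoder hidden states; normalize=True matches "normed_bahdanau".
    attention = tf.contrib.seq2seq.BahdanauAttention(
        num_units=num_units, memory=encoder_outputs, normalize=True)

    cell = tf.contrib.rnn.GRUCell(num_units)
    attn_cell = tf.contrib.seq2seq.AttentionWrapper(
        cell, attention, attention_layer_size=num_units)

    # Teacher forcing during training: feed the embedded target inputs.
    helper = tf.contrib.seq2seq.TrainingHelper(
        inputs=decoder_inputs, sequence_length=decoder_lengths)

    batch_size = tf.shape(encoder_outputs)[0]
    initial_state = attn_cell.zero_state(batch_size, tf.float32).clone(
        cell_state=encoder_state)

    decoder = tf.contrib.seq2seq.BasicDecoder(
        cell=attn_cell, helper=helper, initial_state=initial_state,
        output_layer=tf.layers.Dense(vocab_size))  # project to vocabulary logits

    outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
    return outputs.rnn_output  # [batch, time, vocab_size] logits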

Requirements

Dependencies are listed in requirements.txt; install them with pip (see Usage below).

Project Structure

Project initialized with hb-base

.
├── config                  # Config files (.yml, .json) used with hb-config
├── data/                   # dataset path
├── scripts                 # shell scripts to download the dataset
├── seq2seq_attention       # seq2seq_attention architecture graphs (from input to logits)
│   ├── __init__.py         # Graph
│   ├── encoder.py          # Encoder
│   └── decoder.py          # Decoder
├── data_loader.py          # raw_data -> processed_data -> generate_batch (using Dataset)
├── hook.py                 # training or test hook features (e.g. print_variables; see sketch below)
├── main.py                 # define experiment_fn
└── model.py                # define EstimatorSpec

Reference: hb-config, Dataset, experiment_fn, EstimatorSpec
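
As an illustration of the hook feature mentioned in the tree above, a hypothetical print_variables helper could be built on the standard tf.train.LoggingTensorHook; the actual hook.py in this repository may differ, and the tensor name in the usage comment is hypothetical.

import tensorflow as tf

def print_variables(tensor_names, every_n_iter=100):
    """Return a hook that logs the named tensors every `every_n_iter` steps."""
    return tf.train.LoggingTensorHook(
        tensors={name: name for name in tensor_names},  # label -> tensor name
        every_n_iter=every_n_iter)

# Usage sketch: estimator.train(input_fn, hooks=[print_variables(["train/loss:0"])])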

Todo

Config

All experiment settings can be controlled from the config files.

Example: cornell-movie-dialogs.yml

data:
  base_path: 'data/cornell_movie_dialogs_corpus/'
  conversation_fname: 'movie_conversations.txt'
  line_fname: 'movie_lines.txt'
  processed_path: 'processed_cornell_movie_dialogs_data'
  word_threshold: 2
  max_seq_length: 200
  sentence_diff: 0.33   # (filter pairs by input/output sentence length difference)
  testset_size: 25000

  PAD_ID: 0
  UNK_ID: 1
  START_ID: 2
  EOS_ID: 3

model:
  batch_size: 32
  num_layers: 4
  num_units: 512
  embed_dim: 256
  embed_share: true   # (true or false)
  cell_type: gru      # (lstm, gru, layer_norm_lstm, nas)
  dropout: 0.2
  encoder_type: bi    # (uni / bi)
  attention_mechanism: normed_bahdanau  # (bahdanau, normed_bahdanau, luong, scaled_luong)

train:
  learning_rate: 0.001
  sampling_probability: 0.25  # (Scheduled Sampling)
  
  train_steps: 100000
  model_dir: 'logs/cornell_movie_dialogs'
  
  save_checkpoints_steps: 1000
  loss_hook_n_iter: 1000
  check_hook_n_iter: 1000
  min_eval_frequency: 1000
  
  print_verbose: True
  debug: False

predict:
  beam_width: 5    # (0: GreedyEmbeddingHelper, >=1: BeamSearchDecoder)
  length_penalty_weight: 1.0
  
slack:
  webhook_url: ""  # Slack webhook URL; notifies you via Slack after training finishes
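
The YAML above is loaded through hb-config and then read as attributes anywhere in the project. A rough usage sketch follows; the exact path resolution (config name vs. path, .yml vs. .json) is an assumption based on hb-config's documentation.

from hbconfig import Config

# Load config/cornell-movie-dialogs.yml (path resolution may differ slightly).
Config("config/cornell-movie-dialogs")

# Values become attribute-accessible.
batch_size = Config.model.batch_size          # 32
word_threshold = Config.data.word_threshold   # 2
beam_width = Config.predict.beam_width        # 5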

Usage

Install requirements.

pip install -r requirements.txt

First, check that the model is valid with the tiny config.

python main.py --config check_tiny --mode train

Then, download the Cornell_Movie-Dialogs_Corpus and train on it.

sh scripts/prepare_Cornell_Movie-Dialogs_Corpus.sh
python data_loader.py --config cornell-movie-dialogs  # pre-process the data (see the sketch below)
python main.py --config cornell-movie-dialogs --mode train_and_evaluate
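
For context on the pre-processing step: the Cornell corpus stores utterances in movie_lines.txt and conversation line-id lists in movie_conversations.txt, both delimited by " +++$+++ ". The sketch below shows the kind of pairing data_loader.py performs (consecutive turns become input/output pairs); it is a simplified stand-in, not the repository's actual loader.

import ast

SEP = " +++$+++ "
BASE = "data/cornell_movie_dialogs_corpus/"

def load_lines(path=BASE + "movie_lines.txt"):
    """Map line id -> utterance text."""
    lines = {}
    with open(path, encoding="iso-8859-1") as f:
        for row in f:
            parts = row.rstrip("\n").split(SEP)
            if len(parts) == 5:
                lines[parts[0]] = parts[4]
    return lines

def make_pairs(lines, path=BASE + "movie_conversations.txt"):
    """Build (input, output) utterance pairs from consecutive turns of each conversation."""
    pairs = []
    with open(path, encoding="iso-8859-1") as f:
        for row in f:
            ids = ast.literal_eval(row.rstrip("\n").split(SEP)[3])  # e.g. ['L194', 'L195']
            for a, b in zip(ids, ids[1:]):
                if a in lines and b in lines:
                    pairs.append((lines[a], lines[b]))
    return pairs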

Experiment modes

✅ : Working
◽ : Not tested yet.

  • ✅ evaluate : Evaluate on the evaluation data.
  • ◽ extend_train_hooks : Extends the hooks for training.
  • ◽ reset_export_strategies : Resets the export strategies with the new_export_strategies.
  • ◽ run_std_server : Starts a TensorFlow server and joins the serving thread.
  • ◽ test : Tests training, evaluating and exporting the estimator for a single step.
  • ✅ train : Fit the estimator using the training data.
  • ✅ train_and_evaluate : Interleaves training and evaluation.
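
These modes are the schedules of tf.contrib.learn's learn_runner, which main.py's experiment_fn plugs into. The following is a self-contained toy sketch of that wiring, with a dummy model and input function standing in for the seq2seq graph and data_loader; it illustrates the structure only, not the repository's actual main.py.

import numpy as np
import tensorflow as tf
from tensorflow.contrib.learn.python.learn import learn_runner

def model_fn(features, labels, mode):
    # Toy model standing in for the seq2seq_attention graph.
    logits = tf.layers.dense(features["x"], 2)
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    train_op = None
    if mode == tf.estimator.ModeKeys.TRAIN:
        train_op = tf.train.AdamOptimizer(1e-3).minimize(
            loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)

def input_fn():
    # Toy batch standing in for data_loader's generate_batch.
    x = np.random.rand(32, 8).astype("float32")
    y = np.random.randint(0, 2, size=(32,)).astype("int64")
    return {"x": tf.constant(x)}, tf.constant(y)

def experiment_fn(run_config, hparams):
    estimator = tf.estimator.Estimator(model_fn=model_fn, config=run_config)
    return tf.contrib.learn.Experiment(
        estimator=estimator,
        train_input_fn=input_fn,
        eval_input_fn=input_fn,
        train_steps=hparams.train_steps,
        eval_steps=1)

if __name__ == "__main__":
    # --mode maps to the learn_runner schedule, e.g. "train" or "train_and_evaluate".
    learn_runner.run(
        experiment_fn=experiment_fn,
        run_config=tf.contrib.learn.RunConfig(model_dir="/tmp/check_tiny"),
        schedule="train_and_evaluate",
        hparams=tf.contrib.training.HParams(train_steps=10))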

After training, start chatting.

python chat.py --config cornell-movie-dialogs
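
chat.py itself is not shown here; a hypothetical chat loop around Estimator.predict might look like the sketch below. The encode/decode helpers and the "prediction" output key are assumptions, standing in for whatever vocabulary utilities and EstimatorSpec predictions the project actually uses.

import tensorflow as tf

def chat_loop(estimator, encode, decode):
    """`encode`: text -> int id array (with batch dim); `decode`: id array -> text."""
    while True:
        line = input("> ")
        if not line:
            break
        input_fn = tf.estimator.inputs.numpy_input_fn(
            x={"input": encode(line)}, num_epochs=1, shuffle=False)
        for output in estimator.predict(input_fn=input_fn):
            print(decode(output["prediction"]))  # hypothetical prediction key
            break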

TensorBoard

tensorboard --logdir logs

Conversation Example

> (Human)

> hi.
i ' m sorry . <\s>
> why?
i ' m not interested .
> hmm.. are you human?
nnnnn . <\s>
> you are you?
i ' m not going to marry you . <\s>
> what? me also!
i ' m sorry . <\s>
> anyway, what is your name?
i ' m not going to kill you . <\s>
> i'm asking your name.
i ' m sorry . <\s>
> yeah, so.. where are you from?
i ' m not sure . <\s>
> Have you lost your memory?
i ' m not sure i ' m not going to you . <\s>
> Have you ever been to Florida?
i ' m not sure i ' m not a little . <\s>
> there there.
no , no , no , no ,
> Are you alright?
i ' m not going to die . <\s>
> alsoulutely, you are very healthy.
i don ' t know . <\s>
> you know that? you are robot.
i ' m not sure . <\s>
>  i don ' t know what to say.
i ' m not sure i ' m not a little stupid . <\s>
> what do you like to do in your free time?
i ' m not sure i ' m not gonna take you a little more . <\s>
> are you in love?
i ' m not sure i ' m not stupid . <\s>
> i know.
i ' m not going to let you
> what's your name?
i ' m not sure i ' m not a good guy . <\s>
> oh god..
i ' m not going to do that . <\s>
> let's talk about something else.
i ' m not sure . <\s>
> i have to figure out what is the problem.
i ' m not sure . <\s>
> because of you are stupid.
i ' m not going to get you . <\s>
> florida?
he is a homosexual - -
> really?
i ' m not sure i
> can you tell me about more detail about him?
i ' m not a cop . <\s>
> okay.
i ' m not going to
> you always say 'i....'
i ' m not sure i ' m not a murderer . <\s>
> hmm.....
i ' m not sure i ' m not a little popular . <\s>
> you are not popular.
i ' m not sure i ' m not a girl .
> are you boy?
i ' m not sure . <\s>
> are you girl?
i ' m not sure . <\s>
> who are you?
i ' m sorry . <\s>
> i'm sorry.
i ' m not crowding . <\s>
> just kidding.
i ' m going to be a writer
> great!
i don ' t know .
> what do you mean?
i ' m sorry , i ' m sorry . <\s>
> omg
no . <\s>
> florida.
he hasn ' t got any
> music is my life.
you ' re a fool , you ' re a man !
> wow!
i ' m not kidding .
> sorry.
i ' m not kidding .
> yeah, i know it.
i ' m not sure . <\s>
> california?
what ? <\s>
> haha.
it ' s a real competition

Reference

Author

Dongjun Lee ([email protected])

Contributors

More Repositories

  1. transformer-tensorflow (Python, 349 stars): TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
  2. text-cnn-tensorflow (Python, 248 stars): Convolutional Neural Networks for Sentence Classification (TextCNN), implemented in TensorFlow
  3. quantified-self (Python, 143 stars): Self-knowledge through numbers
  4. char-rnn-tensorflow (Python, 60 stars): Multi-layer recurrent neural networks for character-level language models, implemented in TensorFlow
  5. notes (Python, 52 stars): Notes for math, machine learning, deep learning and research papers
  6. dmn-tensorflow (Python, 41 stars): TensorFlow implementation of 'Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (2015)'
  7. awesome-feeds (28 stars): A curated list of tech, machine learning, business and other feeds
  8. dqn-tensorflow (Python, 25 stars): Deep Q Network implemented in TensorFlow
  9. hb-config (Python, 21 stars): Easy configuration for Python projects, especially deep learning experiments
  10. hb-base (Python, 13 stars): Project structure for deep learning experiments
  11. BeAwesomeToday (13 stars): Be Awesome Today - My Awesome List & Today I Learned & Blogging Articles
  12. DeepLearning-Notebooks (Jupyter Notebook, 12 stars): Deep learning notebooks implemented with TensorFlow, Python and numpy
  13. vae-tensorflow (Python, 8 stars): TensorFlow implementation of Auto-Encoding Variational Bayes
  14. gan-pytorch (Python, 7 stars): PyTorch implementation of 'GAN (Generative Adversarial Networks)'
  15. DataScience-Notebooks (Jupyter Notebook, 7 stars): Collection of data science notebooks
  16. hb-nvim (Vim Script, 6 stars): The ultimate nvim distribution
  17. kino-webhook (Python, 2 stars): Serverless webhook handler
  18. SaladyBot (Python, 2 stars): Slack bot for Salady
  19. relation-network-tensorflow (Python, 2 stars): TensorFlow implementation of 'A simple neural network module for relational reasoning' for the bAbI task
  20. PEP8_kor (2 stars): PEP 8 -- Style Guide for Python Code (Korean)
  21. DongjunLee (2 stars)
  22. BeHappy-Django (CSS, 2 stars): Quantified Self project for happiness, efficiency and activity
  23. bi-att-flow-tensorflow (Python, 2 stars): In progress...
  24. Effective_Python_Notes (1 star): Summary of Effective Python and example notebooks