  • Stars: 251
  • Rank: 155,844 (top 4%)
  • Language: Python
  • Created over 7 years ago
  • Updated almost 5 years ago


Repository Details

Word Prediction using Convolutional Neural Networks

Can you do better than the iPhone™ Keyboard?

In this project, we examine how well neural networks can predict the current or next word. Language modeling is one of the most important NLP tasks, and deep learning approaches to it are easy to find. Our contribution is threefold. First, we build a model that simulates a mobile environment rather than one with general modeling purposes. Therefore, instead of measuring perplexity, we try to save the keystrokes the user needs to type. To this end, we manually typed 64 English paragraphs on an iPhone 7 for comparison. It was super boring, but hopefully it will be useful for others. Next, we use CNNs instead of RNNs, which are more widely used in language modeling tasks. RNNs, even improved variants such as LSTM or GRU, suffer from short-term memory; deep stacks of CNN layers are expected to overcome that limitation. Finally, we employ a character-to-word model: concretely, we predict the current or next word from the preceding 50 characters. Because a prediction must be made at every keystroke, a word-to-word model does not fit well, and a char-to-char model is limited by its autoregressive assumption. Our current belief is that the character-to-word model is best for this task. Although our relatively simple model is still a few steps behind the iPhone 7 keyboard, we observed its potential.
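As a concrete illustration of the character-to-word setup described above (a sketch, not the project's actual preprocessing code), training pairs can be generated like this. `CONTEXT_LEN` and the left-padding convention are assumptions for the example:

```python
# Sketch: building (context, target_word) pairs for a character-to-word model.
# At each typing step, the model sees the preceding characters (up to 50)
# and must predict the word currently being typed.

CONTEXT_LEN = 50

def make_pairs(text):
    """Yield one (context, target_word) pair per character position of each word."""
    pairs = []
    pos = 0
    for word in text.split():
        start = text.index(word, pos)
        for i in range(len(word)):
            # context = everything typed before this point, left-padded to 50 chars
            ctx = text[:start + i][-CONTEXT_LEN:].rjust(CONTEXT_LEN)
            pairs.append((ctx, word))
        pos = start + len(word)
    return pairs

pairs = make_pairs("the cat sat")
# Each pair asks: given what has been typed so far, which word is being typed?
```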

Requirements

  • numpy >= 1.11.1
  • sugartensor >= 0.0.2.4
  • lxml >= 3.6.4
  • nltk >= 3.2.1
  • regex

Background / Glossary / Metric

  • Most smartphone keyboards offer a word prediction option to save the user's typing. If you turn the option on, suggested words appear at the top of the keyboard area. On iPhone, the leftmost suggestion is the verbatim input and the middle one is the top candidate.

  • Full Keystrokes (FK): the keystrokes assuming the user has deactivated the prediction option. In this experiment, the number of FK equals the number of characters (including spaces).

  • Responsive Keystrokes (RK): the keystrokes assuming the user always accepts the suggestion when it matches their intended word. Here, we take only the top candidate into consideration.

  • Keystroke Savings Rate (KSR): the rate of savings by a predictive engine. It is simply calculated as follows.

    • KSR = (FK - RK) / FK
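The definition above can be sketched in a few lines of Python; the numbers plugged in are the totals reported in the Results section:

```python
# Minimal sketch of the Keystroke Savings Rate (KSR) computation.
def ksr(full_keystrokes, responsive_keystrokes):
    """Fraction of keystrokes saved by the predictive engine."""
    return (full_keystrokes - responsive_keystrokes) / full_keystrokes

# Totals from the Results section:
print(round(ksr(40787, 23753), 2))  # our model: 0.42
print(round(ksr(40787, 21535), 2))  # iPhone 7 keyboard: 0.47
```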

Data

  • For training and testing, we built an English news corpus from Wikinews dumps for the last 6 months.
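As a rough sketch of what extracting text from such dumps involves (the project's build_corpus.py uses lxml; the stdlib parser and the simplified, namespace-free sample document below are stand-ins for self-containment):

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified Wikinews-style dump: real MediaWiki exports carry
# namespaces and much more markup, omitted here for brevity.
SAMPLE = """<mediawiki>
  <page><title>Story A</title><revision><text>First article body.</text></revision></page>
  <page><title>Story B</title><revision><text>Second article body.</text></revision></page>
</mediawiki>"""

def extract_texts(xml_string):
    """Collect the raw body of every <text> node in the dump."""
    root = ET.fromstring(xml_string)
    return [node.text for node in root.iter("text")]

print(extract_texts(SAMPLE))  # two article bodies
```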

Model Architecture / Hyper-parameters

  • 20 stacked conv layers with kernel size 5 and 300 dimensions
  • residual connections

Work Flow

  • STEP 1. Download English wikinews dumps.
  • STEP 2. Extract them and copy the xml files to data/raw folder.
  • STEP 3. Run build_corpus.py to build an English news corpus.
  • STEP 4. Run prepro.py to make vocabulary and training/test data.
  • STEP 5. Run train.py.
  • STEP 6. Run eval.py to get the results for the test sentences.
  • STEP 7. We manually tested the same test sentences with the iPhone 7 keyboard.
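Under one simplified convention (accepting a suggestion costs a single tap and does not insert the trailing space; only the top candidate counts), the FK/RK counting behind STEP 6 and STEP 7 can be sketched as follows. The predictor here is a toy stand-in, not the project's model:

```python
def count_keystrokes(sentence, predict_top1):
    """Return (full_keystrokes, responsive_keystrokes) for one sentence."""
    fk = len(sentence)                     # every character, spaces included
    rk = 0
    words = sentence.split()
    for wi, word in enumerate(words):
        typed = ""
        for ch in word:
            if predict_top1(typed) == word:
                rk += 1                    # one tap to accept the suggestion
                break
            typed += ch
            rk += 1                        # one tap per typed character
        if wi < len(words) - 1:
            rk += 1                        # the space between words
    return fk, rk

# Toy predictor: always suggests "the" once a "t" has been typed.
toy = lambda prefix: "the" if prefix.startswith("t") else ""
print(count_keystrokes("the cat", toy))   # → (7, 6)
```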

if you want to use the pretrained model,

Updates

  • In the fourth week of Feb. 2017, we refactored the source file for TensorFlow 1.0.
  • In addition, we changed the last global-average pooling to inverse-weighted pooling. As a result, the KSR improved from 0.39 to 0.42.

Results

The training took ~~4-5~~ 2-3 days on my single GPU (GTX 1060). As can be seen below, our model trails the iPhone in KSR by ~~8~~ 5 percentage points. Details are available in results.csv.

| #FK | #RK: Ours | #RK: iPhone 7 |
|---|---|---|
| 40,787 | 24,727 (=0.39 KSR) → 23,753 (=0.42 KSR) | 21,535 (=0.47 KSR) |

Conclusions

  • Unfortunately, our simple model failed to show better performance than the iPhone predictive engine.
  • Keep in mind that in practice predictive engines make use of other information such as user history.
  • There is still much room for improvement. Here are some ideas.
    • You can refine the model architecture or hyperparameters.
    • As always, bigger data is better.
  • Can anybody implement a traditional n-gram model for comparison?

Cited By

  • Zhe Zeng and Matthias Roetting, "A Text Entry Interface Using Smooth Pursuit Movements and Language Model," Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA '18), 2018.
