• Stars: 4
• Rank: 3,304,323 (top 66%)
• Language: Python
• Created: almost 7 years ago
• Updated: almost 7 years ago

Repository Details

Encoder-decoder model with Luong attention, using two LSTM layers with 500 hidden units each on both the encoder and decoder side. The vocabulary size is 50,000 on both the source (English) and target (Dutch) side. The model is trained on the training split of the TED dataset (https://wit3.fbk.eu/mt.php?release=2017-01-trnmted), with a maximum sequence length of 50.
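
For concreteness, a minimal sketch of this configuration in PyTorch (the framework is an assumption, as the repository may use a different one; all class and variable names below are illustrative, not taken from the repository):

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 50000   # source and target vocabulary size (from the description)
HIDDEN = 500         # hidden units per LSTM layer
LAYERS = 2           # LSTM layers on both encoder and decoder
MAX_LEN = 50         # maximum sequence length

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, HIDDEN)
        self.lstm = nn.LSTM(HIDDEN, HIDDEN, num_layers=LAYERS, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids
        outputs, state = self.lstm(self.embed(src))
        return outputs, state  # outputs: (batch, src_len, HIDDEN)

class LuongDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, HIDDEN)
        self.lstm = nn.LSTM(HIDDEN, HIDDEN, num_layers=LAYERS, batch_first=True)
        self.attn = nn.Linear(HIDDEN, HIDDEN, bias=False)  # Luong "general" score
        self.concat = nn.Linear(2 * HIDDEN, HIDDEN)        # attentional hidden state
        self.out = nn.Linear(HIDDEN, VOCAB_SIZE)

    def forward(self, tgt, state, enc_outputs):
        # tgt: (batch, tgt_len) token ids; enc_outputs: (batch, src_len, HIDDEN)
        dec_outputs, state = self.lstm(self.embed(tgt), state)
        # score(h_t, h_s) = h_t^T W_a h_s -> (batch, tgt_len, src_len)
        scores = self.attn(dec_outputs) @ enc_outputs.transpose(1, 2)
        weights = torch.softmax(scores, dim=-1)
        context = weights @ enc_outputs  # (batch, tgt_len, HIDDEN)
        # h~_t = tanh(W_c [c_t; h_t]), then project to the vocabulary
        attn_hidden = torch.tanh(self.concat(torch.cat([dec_outputs, context], dim=-1)))
        return self.out(attn_hidden), state

# Quick shape check with random token ids
if __name__ == "__main__":
    enc, dec = Encoder(), LuongDecoder()
    src = torch.randint(0, VOCAB_SIZE, (2, MAX_LEN))
    tgt = torch.randint(0, VOCAB_SIZE, (2, MAX_LEN))
    enc_out, state = enc(src)
    logits, _ = dec(tgt, state, enc_out)
    print(logits.shape)  # torch.Size([2, 50, 50000])
```

The decoder here uses the "general" scoring variant from Luong et al. (2015); the repository might instead use the "dot" or "concat" variant, which differ only in how the attention scores are computed.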