mindspore-nlp-tutorial
is a tutorial for those studying NLP (Natural Language Processing) with MindSpore. This repository is migrated from nlp-tutorial; most of the models were ported from the PyTorch version in fewer than 100 lines of code each (excluding comments and blank lines).
- Notice: All models are tested on CPU (Linux and macOS), GPU, and Ascend.
Curriculum - (Example Purpose)
1. Basic Embedding Model
- 1-1. NNLM(Neural Network Language Model) - Predict Next Word
- 1-2. Word2Vec(Skip-gram) - Embedding Words and Show Graph
2. CNN(Convolutional Neural Network)
- 2-1. TextCNN - Binary Sentiment Classification
3. RNN(Recurrent Neural Network)
- 3-1. TextRNN - Predict Next Step
- Paper - Finding Structure in Time(1990)
- 3-2. TextLSTM - Autocomplete
- Paper - LONG SHORT-TERM MEMORY(1997)
- 3-3. Bi-LSTM - Predict Next Word in Long Sentence
4. Attention Mechanism
- 4-1. Seq2Seq - Change Word
- 4-2. Seq2Seq with Attention - Translate
- 4-3. Bi-LSTM with Attention - Binary Sentiment Classification
5. Models based on Transformer
- 5-1. The Transformer - Translate
- Paper - Attention Is All You Need(2017)
- 5-2. BERT - Classify Next Sentence & Predict Masked Tokens
Dependencies
- Python >= 3.7.5
- MindSpore 1.9.0
- PyTorch 1.7.1 (for comparison)
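Before running the tutorials, a quick sanity check of the interpreter against the Python requirement above can save debugging time. This is a minimal sketch; checks for the MindSpore and PyTorch versions are left out, since those packages may not be installed yet:

```python
import sys

# The tutorials require Python >= 3.7.5 (per the dependency list above).
required = (3, 7, 5)
current = sys.version_info[:3]

if current < required:
    raise RuntimeError(
        "Python %s+ required, found %s"
        % (".".join(map(str, required)), ".".join(map(str, current)))
    )

print("Python version OK:", ".".join(map(str, current)))
```

Once MindSpore is installed, `mindspore.__version__` can be checked the same way.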
Author
- Yufeng Lyu
- Author Email : [email protected]
- Acknowledgements to graykode, who open-sourced the original PyTorch and TensorFlow versions.