bookcorpus
Crawl BookCorpus
attention_is_all_you_need
The Transformer from "Attention Is All You Need" (Vaswani et al., 2017), implemented in Chainer
bert-chainer
Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
dynamic_routing_between_capsules
Implementation of "Dynamic Routing Between Capsules" (Sabour, Frosst, and Hinton, NIPS 2017)
convolutional_seq2seq
fairseq: "Convolutional Sequence to Sequence Learning" (Gehring et al., 2017), in Chainer
chainer-openai-transformer-lm
A Chainer implementation of OpenAI's finetuned transformer language model, with a script to import OpenAI's pre-trained weights
der-network
Dynamic Entity Representation (Kobayashi et al., 2016)
variational_dropout_sparsifies_dnn
"Variational Dropout Sparsifies Deep Neural Networks" (Molchanov et al., 2017), in Chainer
captioning_chainer
A fast Chainer implementation of Neural Image Caption
efficient_softmax
BlackOut and Adaptive Softmax for language models, in Chainer
ROCStory_skipthought_baseline
A novel baseline model for the Story Cloze Test and ROCStories
dynamic_neural_text_model
"A Neural Language Model for Dynamically Representing the Meanings of Unknown Words and Entities in a Discourse" (Kobayashi, Okazaki, and Inui, IJCNLP 2017)
interval-bound-propagation-chainer
"Scalable Verified Training for Provably Robust Image Classification" (Gowal et al., ICCV 2019)
turnover_dropout
learning_to_learn
"Learning to Learn by Gradient Descent by Gradient Descent" (Andrychowicz et al., NIPS 2016)
decode_from_mask
Generate a sentence from a masked sentence
weight_normalization
Weight Normalization (Salimans and Kingma, 2016), in Chainer
SDCGAN
Sentence generation with a DCGAN
elmo-chainer
Chainer implementation of contextualized word representations from bidirectional language models (ELMo). Copied into https://github.com/chainer/models/tree/master/elmo-chainer
emergence_of_language_using_discrete_sequences
Emergence of Language Using Discrete Sequences
skip_thought
Language model and Skip-Thought Vectors (Kiros et al., 2015)
vqvae_chainer
Chainer implementation of Neural Discrete Representation Learning (VQ-VAE; van den Oord et al., 2017)
twitter_conversation_crawler
For crawling conversational tweet threads, e.g. to build datasets for chatbots
sru_language_model
Language modeling experiments with SRU and variants
rnnlm_chainer
A fast RNN language model in Chainer
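For readers unfamiliar with the technique behind the weight_normalization repo: Salimans and Kingma reparameterize each weight vector as a direction times a learned scale, w = g * v / ||v||. A minimal numpy sketch of that reparameterization (the function name and shapes are illustrative, not taken from the repo):

```python
import numpy as np

def weight_norm(v, g):
    """Weight normalization: w = g * v / ||v||, with the L2 norm
    taken per output row of the unnormalized weight matrix v."""
    norms = np.linalg.norm(v, axis=1, keepdims=True)  # (out, 1)
    return g[:, None] * v / norms                     # (out, in)

rng = np.random.default_rng(0)
v = rng.normal(size=(4, 3))   # unnormalized directions
g = np.ones(4)                # per-output scales
w = weight_norm(v, g)
# With g = 1, every output row of w has unit L2 norm.
```

In training, gradients flow to both `v` and `g`, decoupling the length of each weight vector from its direction.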
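The interval-bound-propagation-chainer repo implements verified training in the style of Gowal et al.: an input interval is pushed through the network so that the output interval provably contains every perturbed activation. A minimal numpy sketch of the bound propagation through one affine + ReLU layer (variable names and the toy weights are illustrative):

```python
import numpy as np

def ibp_affine(mu, r, W, b):
    """Propagate the interval [mu - r, mu + r] through x -> W @ x + b.
    The center maps linearly; the radius grows by |W| @ r because each
    input coordinate can vary independently within its interval."""
    return W @ mu + b, np.abs(W) @ r

def ibp_relu(mu, r):
    """ReLU is monotone, so applying it to both endpoints is exact."""
    lo = np.maximum(mu - r, 0.0)
    hi = np.maximum(mu + r, 0.0)
    return (lo + hi) / 2.0, (hi - lo) / 2.0

# Toy layer and input interval (illustrative values).
W = np.array([[1.0, -2.0], [0.5, 1.0]])
b = np.zeros(2)
mu, r = np.array([1.0, 1.0]), np.array([0.1, 0.1])
mu2, r2 = ibp_relu(*ibp_affine(mu, r, W, b))
```

Stacking these propagation rules layer by layer yields the sound (if loose) output bounds that the verified training loss penalizes.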
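The routing loop at the heart of the dynamic_routing_between_capsules repo can also be sketched compactly: coupling coefficients are a softmax over routing logits, which are updated by the agreement between each prediction vector and the squashed output capsule. A minimal numpy sketch (shapes and iteration count are illustrative, not taken from the repo):

```python
import numpy as np

def squash(s, eps=1e-8):
    """Squashing nonlinearity: shrinks vectors to norm < 1
    while preserving their direction."""
    n2 = np.sum(s * s, axis=-1, keepdims=True)
    return (n2 / (1.0 + n2)) * s / np.sqrt(n2 + eps)

def route(u_hat, n_iters=3):
    """Dynamic routing over prediction vectors
    u_hat: (in_caps, out_caps, dim) -> output capsules (out_caps, dim)."""
    b = np.zeros(u_hat.shape[:2])  # routing logits, start uniform
    for _ in range(n_iters):
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)        # softmax over out caps
        s = np.einsum('io,iod->od', c, u_hat)       # weighted sum of predictions
        v = squash(s)                                # output capsules
        b = b + np.einsum('iod,od->io', u_hat, v)   # agreement update
    return v

rng = np.random.default_rng(0)
v_out = route(rng.normal(size=(5, 3, 4)))  # 5 input caps, 3 output caps, dim 4
```

Because of the squashing step, each output capsule's norm stays below 1 and can be read as the probability that the entity it represents is present.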