Code for the model presented in the paper: "code2seq: Generating Sequences from Structured Representations of Code"
Code for the paper "Language Models are Unsupervised Multitask Learners"
Simple text generator built on an OpenAI GPT-2 PyTorch implementation
Specify what you want it to build; the AI asks for clarification and then builds it.
PDF GPT lets you chat with the contents of your PDF file using GPT capabilities. An effective open-source solution for turning your PDF files into a chatbot!
Plug and Play Language Model implementation. Lets you steer the topic and attributes of GPT-2 models.
State-of-the-Art Text Embeddings
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Toolkit for Machine Learning, Natural Language Processing, and Text Generation, in TensorFlow. This is part of the CASL project: http://casl-project.ai/
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
Original PyTorch implementation of Cross-lingual Language Model Pretraining.