Quickly train T5/mT5/byT5/CodeT5 models in just 3 lines of code
simpleT5 is built on top of PyTorch Lightning⚡️ and Transformers🤗 and lets you quickly train your T5 models.
T5 models can be used for several NLP tasks such as summarization, QA, QG, translation, text generation, and more.
Here's a link to a Medium article along with an example Colab notebook.
Install
# It's advisable to create a new python environment and install simplet5
pip install --upgrade simplet5
Usage
simpleT5 for a summarization task
# import
from simplet5 import SimpleT5
# instantiate
model = SimpleT5()
# load (supports t5, mt5, byT5 and CodeT5 models)
model.from_pretrained("t5", "t5-base")
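# prepare data (a minimal illustrative sketch — the CSV file and its column
# names below are placeholders, not part of simpleT5; T5-style models usually
# expect source_text to start with a task prefix such as "summarize: ")
import pandas as pd
df = pd.read_csv("news.csv")                          # hypothetical dataset
data_df = pd.DataFrame({
    "source_text": "summarize: " + df["article"],     # task prefix + input text
    "target_text": df["headline"],                    # expected summary
})
eval_df = data_df.sample(frac=0.1, random_state=42)   # simple illustrative split
train_df = data_df.drop(eval_df.index)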
# train
model.train(train_df=train_df, # pandas dataframe with 2 columns: source_text & target_text
            eval_df=eval_df,   # pandas dataframe with 2 columns: source_text & target_text
            source_max_token_len=512,
            target_max_token_len=128,
            batch_size=8,
            max_epochs=5,
            use_gpu=True,
            outputdir="outputs",
            early_stopping_patience_epochs=0,
            precision=32
            )
# load trained T5 model
model.load_model("t5", "path/to/trained/model/directory", use_gpu=False)
# predict
model.predict("input text for prediction")
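For summarization, the text passed to predict typically carries the same task prefix used during training. A minimal sketch (the article text below is a placeholder, and the call is assumed here to return a list of generated strings):
# illustrative inference call; the article text is a placeholder
article = "text of a long news article goes here ..."
summary = model.predict("summarize: " + article)  # assumed to return a list of strings
print(summary[0])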
Articles
- Geek Culture: simpleT5 — Train T5 Models in Just 3 Lines of Code
- Abstractive Summarization with SimpleT5⚡️
- Training T5 model in just 3 lines of Code with ONNX Inference
- Kaggle: simpleT5⚡️ - Generating one line summary of papers
- YouTube: Abstractive Summarization Demo with SimpleT5