Repository Details
C5_W4_A1_Transformer_Subclass_v1

Learning Objectives

- Create positional encodings to capture sequential relationships in data (see the first sketch after this list)
- Calculate scaled dot-product self-attention with word embeddings (see the second sketch after this list)
- Implement masked multi-head attention
- Build and train a Transformer model
- Fine-tune a pre-trained transformer model for Named Entity Recognition
- Fine-tune a pre-trained transformer model for Question Answering
- Implement a QA model in TensorFlow and PyTorch
- Fine-tune a pre-trained transformer model to a custom dataset
- Perform extractive Question Answering
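
For reference, here is a minimal sketch of the sinusoidal positional encodings the first objective refers to. The function name and shapes are illustrative, not the assignment's exact API; it follows the standard formulation where even embedding indices get a sine term and odd indices a cosine term:

```python
import numpy as np

def positional_encoding(positions, d_model):
    """Sinusoidal positional encodings (illustrative sketch).

    Returns an array of shape (1, positions, d_model) that can be
    added to a batch of token embeddings.
    """
    pos = np.arange(positions)[:, np.newaxis]   # (positions, 1)
    i = np.arange(d_model)[np.newaxis, :]       # (1, d_model)
    # Each pair of dimensions (2k, 2k+1) shares one frequency.
    angle_rates = 1.0 / np.power(10000.0, (2 * (i // 2)) / np.float32(d_model))
    angle_rads = pos * angle_rates              # (positions, d_model)
    angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])  # even indices: sin
    angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])  # odd indices: cos
    return angle_rads[np.newaxis, ...]
```

For example, `positional_encoding(50, 512)` yields a `(1, 50, 512)` array that broadcasts across a batch when added to the embedding output.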
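The second sketch covers scaled dot-product attention with an optional mask, touching the second and third objectives. This is a hedged sketch in TensorFlow, not the assignment's exact solution; in particular, the `(1 - mask) * -1e9` convention (mask = 1 for valid positions, 0 for padded or look-ahead positions) is one common choice:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k) + mask term) V.

    q, k, v: tensors with matching leading dims and shapes
    (..., seq_len_q, d_k), (..., seq_len_k, d_k), (..., seq_len_k, d_v).
    """
    matmul_qk = tf.matmul(q, k, transpose_b=True)   # (..., seq_len_q, seq_len_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_logits = matmul_qk / tf.math.sqrt(dk)    # scale by sqrt(d_k)
    if mask is not None:
        # Masked positions get a large negative logit so softmax
        # drives their attention weights toward zero.
        scaled_logits += (1.0 - mask) * -1e9
    attention_weights = tf.nn.softmax(scaled_logits, axis=-1)
    return tf.matmul(attention_weights, v), attention_weights
```

The same function serves both the encoder's self-attention (padding mask) and the decoder's masked multi-head attention (look-ahead mask); multi-head attention splits `d_model` into heads, applies this per head, and concatenates the results.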