Repository Details
This repository contains the code for pre-training a BERT-base model on a large, unannotated text corpus using dynamic Masked Language Modeling (MLM) and Next Sentence Prediction (NSP).
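The sketch below is not the repository's training script; it is a minimal illustration, assuming the Hugging Face `transformers` library, of how dynamic MLM masking and an NSP label can be combined in a single BERT-base pre-training step. The model size, masking probability, and sentence pair are illustrative choices.

```python
# Illustrative sketch (not this repository's exact code): one pre-training
# step for BERT-base with dynamic MLM masking plus an NSP label.
import torch
from transformers import (
    BertConfig,
    BertForPreTraining,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForPreTraining(BertConfig())  # randomly initialised BERT-base

# The collator re-samples masked positions every time a batch is built,
# so the masking pattern changes across epochs ("dynamic" masking).
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

# One illustrative sentence pair; NSP label 0 means "B follows A".
encoded = tokenizer(
    "The cat sat on the mat.",
    "It then fell asleep in the sun.",
)
masked = collator([encoded])  # applies dynamic masking, builds MLM labels

outputs = model(
    input_ids=masked["input_ids"],
    attention_mask=masked["attention_mask"],
    token_type_ids=masked["token_type_ids"],
    labels=masked["labels"],                # MLM targets
    next_sentence_label=torch.tensor([0]),  # NSP target
)
print(float(outputs.loss))  # combined MLM + NSP loss
```

In a full training loop, the collator would be passed to a `DataLoader` (or the `Trainer` API) so that each epoch sees freshly masked copies of the corpus rather than a single static masking.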