  • Stars: 213
  • Rank: 184,334 (Top 4%)
  • Language: Python
  • License: Apache License 2.0
  • Created: almost 5 years ago
  • Updated: almost 2 years ago

Repository Details

Named Entity Recognition with BERT using TensorFlow 2.0

BERT NER

Use Google BERT to do CoNLL-2003 NER!

Train the model using Python and TensorFlow 2.0

Related projects:

  • ALBERT-TF2.0
  • BERT-SQuAD
  • BERT-NER-Pytorch

Requirements

  • python3
  • pip3 install -r requirements.txt

Download Pretrained Models from TensorFlow official models

The code for the pre-trained BERT model comes from tensorflow-official-models.

Run

Single GPU

python run_ner.py --data_dir=data/ --bert_model=bert-base-cased --output_dir=out_base --max_seq_length=128 --do_train --num_train_epochs 3 --do_eval --eval_on dev

Multi GPU

python run_ner.py --data_dir=data/ --bert_model=bert-large-cased --output_dir=out_large --max_seq_length=128 --do_train --num_train_epochs 3 --multi_gpu --gpus 0,1,2,3 --do_eval --eval_on test
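Both commands read CoNLL-2003 style data from --data_dir. A short sketch of that format (one token per line with whitespace-separated columns, the last column being the NER tag in the BIO scheme; sentences are separated by blank lines; the exact file names expected under data/ are not shown here):

EU NNP B-NP B-ORG
rejects VBZ B-VP O
German JJ B-NP B-MISC
call NN I-NP O
to TO B-VP O
boycott VB I-VP O
British JJ B-NP B-MISC
lamb NN I-NP O
. . O O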

Result

BERT-BASE

Validation Data

             precision    recall  f1-score   support

        PER     0.9677    0.9756    0.9716      1842
        LOC     0.9671    0.9592    0.9631      1837
       MISC     0.8872    0.9132    0.9001       922
        ORG     0.9191    0.9314    0.9252      1341

avg / total     0.9440    0.9509    0.9474      5942

Test Data

             precision    recall  f1-score   support

        ORG     0.8773    0.9037    0.8903      1661
        PER     0.9646    0.9592    0.9619      1617
       MISC     0.7691    0.8305    0.7986       702
        LOC     0.9333    0.9305    0.9319      1668

avg / total     0.9053    0.9184    0.9117      5648

The pretrained model can be downloaded from here

BERT-LARGE

Validation Data

             precision    recall  f1-score   support

        ORG     0.9290    0.9374    0.9332      1341
       MISC     0.8967    0.9230    0.9097       922
        PER     0.9713    0.9734    0.9723      1842
        LOC     0.9748    0.9701    0.9724      1837

avg / total     0.9513    0.9564    0.9538      5942

Test Data

             precision    recall  f1-score   support

        LOC     0.9256    0.9329    0.9292      1668
       MISC     0.7891    0.8419    0.8146       702
        PER     0.9647    0.9623    0.9635      1617
        ORG     0.8903    0.9133    0.9016      1661

avg / total     0.9094    0.9242    0.9167      5648

The pretrained model can be downloaded from here

Inference

from bert import Ner

model = Ner("out_base/")

output = model.predict("Steve went to Paris")

print(output)
'''
    [
        {
            "confidence": 0.9981840252876282,
            "tag": "B-PER",
            "word": "Steve"
        },
        {
            "confidence": 0.9998939037322998,
            "tag": "O",
            "word": "went"
        },
        {
            "confidence": 0.999891996383667,
            "tag": "O",
            "word": "to"
        },
        {
            "confidence": 0.9991968274116516,
            "tag": "B-LOC",
            "word": "Paris"
        }
    ]
'''
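The tags follow the BIO scheme, so consecutive B-/I- tokens can be merged into entity spans. A minimal post-processing sketch, assuming only the predict() output shown above (the group_entities helper below is not part of this repository):

def group_entities(tokens):
    # Merge BIO-tagged tokens from model.predict() into entity spans
    entities = []
    current = None
    for item in tokens:
        tag = item["tag"]
        if tag.startswith("B-"):
            if current:
                entities.append(current)
            current = {"entity": tag[2:], "text": item["word"]}
        elif tag.startswith("I-") and current and current["entity"] == tag[2:]:
            current["text"] += " " + item["word"]
        else:
            # "O" tag (or an I- tag without a matching B-): close any open span
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return entities

print(group_entities(output))
# [{'entity': 'PER', 'text': 'Steve'}, {'entity': 'LOC', 'text': 'Paris'}]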

Deploy REST-API

The BERT NER model can be deployed as a REST API

python api.py

The API will be live at 0.0.0.0:8000 with a /predict endpoint

cURL request

curl -X POST http://0.0.0.0:8000/predict -H 'Content-Type: application/json' -d '{ "text": "Steve went to Paris" }'

Output

{
    "result": [
        {
            "confidence": 0.9981840252876282,
            "tag": "B-PER",
            "word": "Steve"
        },
        {
            "confidence": 0.9998939037322998,
            "tag": "O",
            "word": "went"
        },
        {
            "confidence": 0.999891996383667,
            "tag": "O",
            "word": "to"
        },
        {
            "confidence": 0.9991968274116516,
            "tag": "B-LOC",
            "word": "Paris"
        }
    ]
}
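The same endpoint can also be called from Python. A minimal client sketch using the requests library (an extra dependency, not listed in the requirements above; it assumes api.py is already running on port 8000):

import requests

# POST the text to the /predict endpoint exposed by api.py
response = requests.post(
    "http://0.0.0.0:8000/predict",
    json={"text": "Steve went to Paris"},
)
print(response.json()["result"])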

cURL

[curl output screenshot]

Postman

[Postman output screenshot]

PyTorch version: see BERT-NER-Pytorch above

More Repositories

  1. BERT-NER: Pytorch-Named-Entity-Recognition-with-BERT (Python, 1,199 stars)
  2. BERT-SQuAD: SQuAD Question Answering Using BERT, PyTorch (Python, 396 stars)
  3. Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs (Python, 357 stars)
  4. ALBERT-TF2.0: ALBERT model Pretraining and Fine Tuning using TF2.0 (Python, 199 stars)
  5. stable-diffusion-tritonserver: Deploy stable diffusion model with onnx/tensorrt + tritonserver (Jupyter Notebook, 119 stars)
  6. Vision-Transformer: Vision Transformer using TensorFlow 2.0 (Python, 95 stars)
  7. DATA-SCIENCE-BOWL-2018: Find the nuclei in divergent images to advance medical discovery (Jupyter Notebook, 90 stars)
  8. e5-mistral-7b-instruct: Finetune mistral-7b-instruct for sentence embeddings (Python, 65 stars)
  9. minGPT-TF: A minimal TF2 re-implementation of the OpenAI GPT training (Jupyter Notebook, 55 stars)
  10. BioELECTRA (51 stars)
  11. Swin-Transformer-Serve: Deploy Swin Transformer using TorchServe (Python, 26 stars)
  12. TAPAS-TF2: End-to-end neural table-text understanding models (Python, 8 stars)
  13. Malayalam-News-Classifier (Python, 7 stars)
  14. BioGPT-HF (Jupyter Notebook, 5 stars)
  15. Tapas-Tutorial (Jupyter Notebook, 3 stars)
  16. Summarizer (Python, 2 stars)
  17. pytorch-tutorial (Jupyter Notebook, 2 stars)
  18. Redis-Stack-Bitnami-Helm-Chart: Redis Stack Server Helm Chart (Mustache, 1 star)
  19. librispeech_100_jax (Python, 1 star)
  20. Tensorflow-Paper-Implementation (Python, 1 star)
  21. BioNLP-Corpus (Python, 1 star)
  22. dlrm-jax (Python, 1 star)
  23. S4-Standalone (Python, 1 star)
  24. Multilingual-Complex-Named-Entity-Recognition (Python, 1 star)
  25. git-actions-python (Python, 1 star)
  26. NLI4CT (Jupyter Notebook, 1 star)
  27. BioSimCSE (1 star)