  • Stars: 1,252
  • Rank: 37,538 (Top 0.8%)
  • Language: Python
  • License: Apache License 2.0
  • Created: about 1 year ago
  • Updated: 3 months ago

Repository Details

Generalist and Lightweight Model for Named Entity Recognition (Extract any entity types from texts) @ NAACL 2024

GLiNER: Generalist and Lightweight Model for Named Entity Recognition

GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using a bidirectional transformer encoder (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entity types, and to Large Language Models (LLMs), which, despite their flexibility, are costly and too large for resource-constrained scenarios.

Demo Image

Models Status

📢 Updates

  • 📝 Finetuning notebook is available: examples/finetune.ipynb
  • 🗂 Training dataset preprocessing scripts are now available in the data/ directory, covering both Pile-NER 📚 and NuNER 📘 datasets.

Available Models on Hugging Face

To Release

  • ⏳ GLiNER-Multiv2
  • ⏳ GLiNER-Sup (trained on mixture of NER datasets)

Area of improvements / research

  • Allow longer context (e.g., train with long-context transformers such as Longformer, LED, etc.)
  • Use a bi-encoder (entity encoder and span encoder) to allow precomputing entity embeddings
  • Add a filtering mechanism to reduce the number of spans before final classification, saving memory and computation when the number of entity types is large
  • Improve understanding of more detailed prompts/instructions, e.g., "Find the first name of the person in the text"
  • Better loss function: for instance, use Focal Loss (see this paper) instead of BCE to handle class imbalance, as some entity types are more frequent than others (a minimal sketch follows this list)
  • Improve multilingual capabilities: train on more languages and use multilingual training data
  • Decoding: allow a span to have multiple labels, e.g., "Cristiano Ronaldo" is both a "person" and a "football player"
  • Dynamic thresholding (in model.predict_entities(text, labels, threshold=0.5)): allow the model to predict more or fewer entities depending on the context. Currently, the model tends to predict fewer entities when the entity type or domain is not well represented in the training data.
  • Train with EMAs (Exponential Moving Averages) or merge multiple checkpoints to improve model robustness (see this paper)
  • Extend the model to relation extraction, which requires a dataset with relation annotations; see our preliminary work ATG.
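
As a loose illustration of the focal-loss item above, here is a minimal PyTorch sketch of a binary focal loss that could stand in for BCE over span-type scores. It is not part of the GLiNER codebase, and the alpha/gamma values are common defaults, not settings recommended by the authors.

import torch
import torch.nn.functional as F

def binary_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Focal loss down-weights easy examples so rare entity types
    contribute relatively more to the gradient than with plain BCE."""
    # Element-wise BCE computed from logits for numerical stability.
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    # p_t is the probability the model assigns to the true class.
    p_t = targets * p + (1 - targets) * (1 - p)
    # alpha_t re-balances positive vs. negative examples.
    alpha_t = targets * alpha + (1 - targets) * (1 - alpha)
    return (alpha_t * (1 - p_t) ** gamma * bce).mean()

# Toy example: three span-type scores, one positive label.
logits = torch.tensor([2.0, -1.0, 0.5])
targets = torch.tensor([1.0, 0.0, 0.0])
print(binary_focal_loss(logits, targets))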

Installation

To use this model, you must install the GLiNER Python library:

pip install gliner

Usage

Once you've installed the GLiNER library, you can import the GLiNER class. You can then load this model using GLiNER.from_pretrained and predict entities with predict_entities.

from gliner import GLiNER

model = GLiNER.from_pretrained("urchade/gliner_base")

text = """
Cristiano Ronaldo dos Santos Aveiro (Portuguese pronunciation: [kɾiʃˈtjɐnu ʁɔˈnaldu]; born 5 February 1985) is a Portuguese professional footballer who plays as a forward for and captains both Saudi Pro League club Al Nassr and the Portugal national team. Widely regarded as one of the greatest players of all time, Ronaldo has won five Ballon d'Or awards,[note 3] a record three UEFA Men's Player of the Year Awards, and four European Golden Shoes, the most by a European player. He has won 33 trophies in his career, including seven league titles, five UEFA Champions Leagues, the UEFA European Championship and the UEFA Nations League. Ronaldo holds the records for most appearances (183), goals (140) and assists (42) in the Champions League, goals in the European Championship (14), international goals (128) and international appearances (205). He is one of the few players to have made over 1,200 professional career appearances, the most by an outfield player, and has scored over 850 official senior career goals for club and country, making him the top goalscorer of all time.
"""

labels = ["person", "award", "date", "competitions", "teams"]

entities = model.predict_entities(text, labels, threshold=0.5)

for entity in entities:
    print(entity["text"], "=>", entity["label"])

Expected output:

Cristiano Ronaldo dos Santos Aveiro => person
5 February 1985 => date
Al Nassr => teams
Portugal national team => teams
Ballon d'Or => award
UEFA Men's Player of the Year Awards => award
European Golden Shoes => award
UEFA Champions Leagues => competitions
UEFA European Championship => competitions
UEFA Nations League => competitions
Champions League => competitions
European Championship => competitions
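
The threshold argument controls how confident a span prediction must be before it is returned, which is the knob the "dynamic thresholding" improvement above refers to. A small sketch, reusing the model, text, and labels from the example, assuming you simply want to compare how many entities survive at each threshold:

# Lower thresholds keep lower-confidence spans (more entities, higher recall);
# higher thresholds keep only confident spans (fewer entities, higher precision).
for threshold in (0.3, 0.5, 0.7):
    entities = model.predict_entities(text, labels, threshold=threshold)
    print(f"threshold={threshold}: {len(entities)} entities predicted")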

Named Entity Recognition benchmark results

(Benchmark results figure.)

Model Authors

The model authors are Urchade Zaratiana, Nadi Tomeh, Pierre Holat, and Thierry Charnois.

Citation

@misc{zaratiana2023gliner,
      title={GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer}, 
      author={Urchade Zaratiana and Nadi Tomeh and Pierre Holat and Thierry Charnois},
      year={2023},
      eprint={2311.08526},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

More Repositories

1. GraphER (Python, 50 stars): End-to-end zero-shot entity and relation extraction
2. ATG (Python, 40 stars): Official code for our paper "An Autoregressive Text-to-Graph Framework for Joint Entity and Relation Extraction", published at AAAI 2024
3. graph-neural-nets (Jupyter Notebook, 23 stars): Graph neural networks tutorial in PyTorch (GCN, GAT, Node2vec, GraphSAGE, ClusterGCN, ...)
4. EnriCo (Python, 22 stars): End-to-end zero-shot relation extraction
5. semi-supervised-learning (Jupyter Notebook, 14 stars): Semi-supervised learning tutorial in PyTorch (pseudo-labeling, Pi model, mean teacher, FixMatch, UDA)
6. Filtered-Semi-Markov-CRF (Python, 12 stars): Code for our paper accepted at EMNLP 2023 (Findings)
7. span-structured-prediction (10 stars): Repository for my research on span-based structured prediction for information extraction
8. molgen (Jupyter Notebook, 9 stars): Molecule SMILES generation with GAN and Reinforcement Learning (training a language GAN from scratch)
9. GNNer (Python, 8 stars): Code for "GNNer: Reducing Overlapping in Span-based NER Using Graph Neural Networks"
10. DyREx (Python, 7 stars): DyREx: Dynamic Query Representation for Extractive Question Answering, accepted at ENLSP@NeurIPS2022
11. Zero-shot-Text-classification (Python, 6 stars): Implementation of the paper "Test Anywhere: Zero-Shot Learning for Text Classification"
12. HNER (Python, 5 stars): Hierarchical Transformer Model for Scientific Named Entity Recognition
13. transformer-tutorial (Jupyter Notebook, 5 stars): Transformer tutorial in PyTorch (bidirectional, autoregressive, sequence-to-sequence)
14. struct_ie (Python, 3 stars): Structured information extraction with LLMs
15. DyREF (Python, 1 star): DyReF: Extractive Question Answering with Dynamic Query Representation for Free
16. bert-summarization (Python, 1 star): Extractive text summarization with BERT
17. synthetic_attention (Python, 1 star)
18. global-span-selection (1 star)