• Stars: 205
• Rank: 191,264 (top 4%)
• Language: Python
• Created: over 4 years ago
• Updated: over 1 year ago


Repository Details

Question-Answering-Albert-Electra

Question answering using ALBERT and ELECTRA with Wikipedia text as context.

Description

This repository implements a pipeline that answers questions using Wikipedia text. The pipeline is:

  1. Use the input query to run a Google search, filtered to Wikipedia pages.
  2. Fetch the body content of each Wikipedia page, preprocess the text, and split the corpus into paragraphs.
  3. Rank the candidate passages with the BM25 algorithm, keeping the top-K paragraphs.
  4. Feed the selected paragraphs as input to the ALBERT and ELECTRA models.
  5. Both models try to find the answer within the candidate paragraphs.
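Step 3 above can be sketched with a minimal, stdlib-only BM25 (Okapi variant) ranker. This is an illustrative sketch, not the project's actual code; the repository may well use a library implementation instead.

```python
import math
from collections import Counter

def bm25_rank(query, paragraphs, k1=1.5, b=0.75, top_k=3):
    """Rank paragraphs against a query with BM25 and return the top-K.

    Hypothetical helper illustrating step 3 of the pipeline;
    k1 and b are the usual BM25 free parameters.
    """
    docs = [p.lower().split() for p in paragraphs]
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n  # average document length
    # document frequency of each term across the corpus
    df = Counter()
    for d in docs:
        df.update(set(d))
    q_terms = query.lower().split()
    scored = []
    for i, d in enumerate(docs):
        tf = Counter(d)
        score = 0.0
        for t in q_terms:
            if t not in tf:
                continue
            idf = math.log(1 + (n - df[t] + 0.5) / (df[t] + 0.5))
            num = tf[t] * (k1 + 1)
            den = tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            score += idf * num / den
        scored.append((score, i))
    scored.sort(reverse=True)
    return [paragraphs[i] for _, i in scored[:top_k]]
```

The top-K paragraphs returned here are what would be handed to the reader models in steps 4 and 5.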

Running

To predict with ELECTRA, you need to download the pre-trained model from here. Extract the folder and adjust DATA_MODEL_DIR (line 26) in qa_predict.py to point to the root folder.
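At prediction time, an extractive QA model such as ELECTRA emits a start logit and an end logit per token of the candidate paragraph, and decoding picks the best-scoring span. The following is a hypothetical decoder for illustration; the actual logic in qa_predict.py may differ.

```python
def best_answer_span(tokens, start_logits, end_logits, max_len=15):
    """Pick the highest-scoring (start, end) answer span.

    Scores each valid span as start_logit[s] + end_logit[e],
    constrained to e >= s and a maximum span length.
    Hypothetical helper, not the repository's actual decoder.
    """
    best = (float("-inf"), 0, 0)  # (score, start, end)
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best[0]:
                best = (score, s, e)
    _, s, e = best
    return " ".join(tokens[s:e + 1])
```

With both ALBERT and ELECTRA producing a span per paragraph, the pipeline can compare the candidates' scores to choose a final answer.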

(Screenshots: Question 1, Question 2, Question 3, BM25 scores)

More Repositories

  1. next_word_prediction (Python, 725 stars) - Using transformers to predict the next word and the <mask> word
  2. Bart_T5-summarization (HTML, 168 stars) - Summarization task using BART and T5 models
  3. bg-remove-augment (Python, 156 stars)
  4. Multiple-Choice-Question-Generation-T5-and-Text2Text (Python, 153 stars) - Question generation using Google T5 and Text2Text
  5. Semantic-Search (Python, 110 stars) - Semantic search using Transformers and others
  6. T5-paraphrase-generation (HTML, 67 stars)
  7. BERT-cpp-inference (Makefile, 52 stars)
  8. GAN-image-inpainting (Python, 30 stars) - Deep learning techniques for image inpainting
  9. Deploying-YOLOv5-fastapi-celery-redis-rabbitmq (Python, 29 stars)
  10. Switch-Transformers-in-Seq2Seq (Python, 22 stars)
  11. go-hexagonal-shortener (Go, 12 stars)
  12. Deploying-Deep-Learning-Models-in-C-A-comparison-with-Python-server (Makefile, 9 stars)
  13. autograde-deeplearning (HTML, 7 stars) - This project aims to implement an auto-grading assistant capable of scoring the "correctness" of a provided answer against a target answer
  14. electra-squad-8GB (Python, 7 stars) - Fine-tuning the ELECTRA large model on the SQuAD 2.0 dataset using an RTX 2080 8 GB
  15. webapp-StyleGAN2-ADA-PyTorch (Python, 6 stars)
  16. deep-reinforcement-learning (Python, 6 stars)
  17. sync_async_await_multiprocess_multithread (Python, 4 stars)
  18. django_auth_face_recognition (JavaScript, 4 stars) - Django web app using face recognition to authenticate users
  19. flickr-downloader (Python, 3 stars)
  20. bert-nq-python3 (Python, 3 stars) - Bert-NQ - Google BERT for Natural Questions, adjusted for Python 3
  21. object_segmentation (Python, 3 stars) - Object segmentation using SOTA models
  22. bert_sentiment_flask (Python, 2 stars) - Using a Flask API to predict sentiment with a BERT model
  23. Patches-are-all-you-need-ViT-and-ConvMixer (Python, 2 stars)
  24. go-node-python (Go, 2 stars)
  25. qa-electra-predict (Python, 1 star) - Predicting with an ELECTRA model fine-tuned on SQuAD 2.0
  26. machine_learning (Python, 1 star) - Examples for the course
  27. criptografia (HTML, 1 star) - Cryptography example
  28. RL-Soft-Actor-Critic_Pytorch (Python, 1 star)
  29. Types-of-Convolution-PyTorch (Python, 1 star)
  30. siamese-neural-net-pytorch (Jupyter Notebook, 1 star)
  31. golang-whatsapp-broker (Go, 1 star) - WhatsApp broker written in Go