Multiple Choice Question Generation with Google T5 and Text2Text (a BERT-based model)
This application builds on the work from Question Generation using Transformers.
It is served with FastAPI.
It also makes it easy to compare the performance of the two models: T5 and the BERT-based Text2Text.
Application Details
- Both models take the input text and produce questions, each with one correct answer.
- Each correct answer is fed to Sense2Vec to retrieve the phrases most similar to it.
- The highest-scoring similarities are ignored when producing the distractors.
- When the answer is a noun referring to a person's name or a named entity, Sense2Vec finds no candidate and returns none.
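The distractor-selection step described above can be sketched as follows. Sense2Vec results are represented here as `(phrase, score)` pairs; the helper name, the `skip_top` cutoff, and the deduplication rules are assumptions for illustration, not the app's actual code:

```python
def pick_distractors(answer, candidates, k=3, skip_top=1):
    """Pick k distractors from Sense2Vec-style (phrase, score) candidates.

    Hypothetical sketch: the highest-scoring candidates are skipped
    because they tend to be near-duplicates of the correct answer
    (e.g. casing or plural variants); the rest become distractors.
    """
    # Sort by similarity score, best first.
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    distractors = []
    seen = {answer.lower()}
    for phrase, _score in ranked[skip_top:]:
        cleaned = phrase.replace("_", " ").strip()
        if cleaned.lower() in seen:  # drop duplicates of the answer or of each other
            continue
        seen.add(cleaned.lower())
        distractors.append(cleaned)
        if len(distractors) == k:
            break
    return distractors
```

For example, with the answer "Berlin" and candidates `[("Berlin", 0.95), ("berlin", 0.93), ("Munich", 0.81), ("Hamburg", 0.78), ("Cologne", 0.70)]`, this returns `["Munich", "Hamburg", "Cologne"]`: the top hit is skipped, the case-variant duplicate is dropped, and the next three candidates survive.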
Running
- Download the Sense2Vec pretrained vectors.
- Extract the archive into the root folder; a folder named s2v_old will be created.
```shell
pip install -r requirements.txt
uvicorn app:app --reload --log-level debug
```
Open your browser at http://localhost:8000.
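Once the server is up, each generated item presumably bundles a question, the correct answer, and the distractors into a single multiple-choice payload. A minimal sketch of that assembly step (the field names and helper are assumptions, not the app's actual schema):

```python
import random

def build_mcq(question, answer, distractors, seed=0):
    """Assemble one multiple-choice item, shuffling the correct answer
    in among the distractors so its position is not predictable.
    (Hypothetical sketch; the real app's response shape may differ.)
    """
    options = [answer] + list(distractors)
    random.Random(seed).shuffle(options)  # deterministic shuffle for a given seed
    return {"question": question, "options": options, "answer": answer}
```

For example, `build_mcq("What is the capital of Germany?", "Berlin", ["Munich", "Hamburg", "Cologne"])` yields a dict whose `options` list holds all four choices in shuffled order, with `answer` marking the correct one.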
Disclaimer
Feel free to fork, improve, and share. If you improve it, let me know about your improvements.