

MTM

Meta-pretraining Then Meta-learning (MTM) Model for Few-Shot NLP Tasks


The source code for the papers "Improving Few-shot Text Classification via Pretrained Language Representations" and "When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification".
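As a rough illustration of the two-stage idea named in the title, the sketch below first pretrains a text encoder on an unsupervised masked-token objective and then meta-learns it over few-shot episodes. It is a minimal toy example in PyTorch: the tiny encoder, the random "corpus", and the prototypical-network-style episode loss are illustrative assumptions, not the actual model or training procedure in this repository.

```python
# Illustrative sketch only: "pretrain, then meta-learn" on toy data.
# Everything here (encoder, vocabulary, episode construction) is a hypothetical
# simplification, not the repository's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID = 100, 32, 64

class Encoder(nn.Module):
    """Tiny bag-of-embeddings encoder standing in for a pretrained language model."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.proj = nn.Linear(EMB, HID)

    def forward(self, tokens):                           # tokens: (batch, seq_len)
        return self.proj(self.emb(tokens).mean(dim=1))   # (batch, HID)

def meta_pretrain(encoder, steps=200):
    """Stage 1: unsupervised pretraining (here, predicting a masked token id)."""
    head = nn.Linear(HID, VOCAB)
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
    for _ in range(steps):
        tokens = torch.randint(0, VOCAB, (16, 10))       # toy unlabeled corpus batch
        target = tokens[:, 0]                            # pretend the first token is masked
        masked = tokens.clone()
        masked[:, 0] = 0                                 # id 0 acts as the [MASK] token
        loss = F.cross_entropy(head(encoder(masked)), target)
        opt.zero_grad(); loss.backward(); opt.step()

def meta_learn(encoder, episodes=200, n_way=3, k_shot=5, q_queries=5):
    """Stage 2: episodic meta-learning with a prototypical-network-style objective."""
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-4)
    for _ in range(episodes):
        # Toy episode: each "class" draws tokens from a distinct id range.
        support, query, q_labels = [], [], []
        for c in range(n_way):
            lo, hi = 1 + c * 30, 1 + (c + 1) * 30
            support.append(torch.randint(lo, hi, (k_shot, 10)))
            query.append(torch.randint(lo, hi, (q_queries, 10)))
            q_labels += [c] * q_queries
        proto = torch.stack([encoder(s).mean(dim=0) for s in support])  # (n_way, HID)
        q_emb = encoder(torch.cat(query))                               # (n_way*q, HID)
        logits = -torch.cdist(q_emb, proto)                             # closer prototype = higher score
        loss = F.cross_entropy(logits, torch.tensor(q_labels))
        opt.zero_grad(); loss.backward(); opt.step()

if __name__ == "__main__":
    enc = Encoder()
    meta_pretrain(enc)   # "meta-pretraining" stage
    meta_learn(enc)      # "meta-learning" stage
```

The point of the sketch is only the ordering: the encoder's weights are shaped by unsupervised pretraining before being adapted episode by episode for few-shot classification. For the actual MTM model and training details, see the papers cited below and the code in this repository.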

If you use the code, please cite the following paper:

@inproceedings{deng2020low,
  title={When Low Resource NLP Meets Unsupervised Language Model: Meta-Pretraining then Meta-Learning for Few-Shot Text Classification (Student Abstract).},
  author={Deng, Shumin and Zhang, Ningyu and Sun, Zhanlin and Chen, Jiaoyan and Chen, Huajun},
  booktitle={AAAI},
  pages={13773--13774},
  year={2020}
}