Meta-pretraining Then Meta-learning (MTM) Model for Few-Shot NLP Tasks
Source code for the papers "Improving Few-shot Text Classification via Pretrained Language Representations" and "When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification".
If you use the code, please cite the following paper:
@inproceedings{deng2020low,
title={When Low Resource NLP Meets Unsupervised Language Model: Meta-Pretraining then Meta-Learning for Few-Shot Text Classification (Student Abstract)},
author={Deng, Shumin and Zhang, Ningyu and Sun, Zhanlin and Chen, Jiaoyan and Chen, Huajun},
booktitle={AAAI},
pages={13773--13774},
year={2020}
}