ACE
[ACL-IJCNLP 2021] Automated Concatenation of Embeddings for Structured Prediction

EcomGPT
An Instruction-tuned Large Language Model for E-commerce

HiAGM
Hierarchy-Aware Global Model for Hierarchical Text Classification

SeqGPT
SeqGPT: An Out-of-the-box Large Language Model for Open Domain Sequence Understanding

KB-NER
Winner system (DAMO-NLP) of the SemEval 2022 MultiCoNER shared task on 10 of 13 tracks.

Multi-CPR
[SIGIR 2022] Multi-CPR: A Multi Domain Chinese Dataset for Passage Retrieval

CLNER
[ACL-IJCNLP 2021] Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning

MultilangStructureKD
[ACL 2020] Structure-Level Knowledge Distillation For Multilingual Sequence Labeling

MuVER
[EMNLP 2021] MuVER: Improving First-Stage Entity Retrieval with Multi-View Entity Representations

ProtoRE
Code for "Prototypical Representation Learning for Relation Extraction"

RankingGPT
Code for the paper "RankingGPT: Empowering Large Language Models in Text Ranking with Progressive Enhancement"

DAAT-CWS
Coupling Distant Annotation and Adversarial Training for Cross-Domain Chinese Word Segmentation

AISHELL-NER
[ICASSP 2022] AISHELL-NER: Named Entity Recognition from Chinese Speech

HLATR
Hybrid List Aware Transformer Reranking

AIN
Code for the EMNLP 2020 paper "AIN: Fast and Accurate Sequence Labeling with Approximate Inference Network"

EBM-Net
Code for the EMNLP 2020 paper "Predicting Clinical Trial Results by Implicit Evidence Integration"

CDQA
CDQA: Chinese Dynamic Question Answering Benchmark

StructuralKD
[ACL-IJCNLP 2021] Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor

MarCo-Dialog

IBKD
Official repository for the IBKD knowledge distillation method, as described in the paper.

Vec-RA-ODQA
Source code for the paper "Improving Retrieval Augmented Open-Domain Question-Answering with Vectorized Contexts"

Key-Point-Analysis