korean-malicious-comments-dataset
Korean malicious comments dataset

CPFT
(unofficial) Code for "Few-Shot Intent Detection via Contrastive Pre-Training and Fine-Tuning", EMNLP 2021

RADCoT
Code for "RADCoT: Retrieval-Augmented Distillation to Specialization Models for Generating Chain-of-Thoughts in Query Expansion", LREC-COLING 2024

pytorch_lightning_transformer
(unofficial) Code for "Attention Is All You Need", NIPS 2017

2021_korean_competition
2021 National Institute of Korean Language AI language ability evaluation, Team 화성갈끄니까 (5th/31)

symlink
Code for "JBNU-CCLab at SemEval-2022 Task 12: Machine Reading Comprehension and Span Pair Classification for Linking Mathematical Symbols to Their Descriptions", SemEval@NAACL 2022 (1st in all subtasks) https://aclanthology.org/2022.semeval-1.231/

korean_extractive_summarization
Korean extractive summarization. Team 화성갈끄니까 code for the 2021 AI text summarization online hackathon (1st/50)

MAFiD
Code for "MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data", EACL 2023 Findings

Book-Manager-WPF
2019 Windows programming coursework: a book management program (Book Manager) in C# WPF

gpt2_finetune_essay
Essay writing with a fine-tuned GPT-2

pytorch_lightning_bert
BERT implementation from scratch

Gmlp_pretrain
Pretraining code for a gMLP language model

Naver-news-article-classification-using-attention-based-bi-lstm-with-pytorch
Korean news article classification using an attention-based Bi-LSTM in PyTorch

regression-MLP-by-numpy
Implementations of linear regression, logistic regression, and an MLP in NumPy

2021_IR_opendomainqa
2021 information retrieval coursework: an open-domain question answering system over Korean Wikipedia