korean-malicious-comments-dataset
Korean malicious comments dataset.

CPFT
(unofficial) Code for "Few-Shot Intent Detection via Contrastive Pre-Training and Fine-Tuning", EMNLP 2021.
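The pre-training stage in this line of work typically optimizes a supervised contrastive objective over utterance embeddings. A minimal PyTorch sketch of such a loss (illustrative only, not the repository's code; function name and defaults are assumptions):

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of labeled embeddings.

    embeddings: (B, D) utterance representations; labels: (B,) intent ids.
    Illustrative sketch, not the CPFT repository's implementation.
    """
    z = F.normalize(embeddings, dim=1)            # unit-norm embeddings
    sim = z @ z.t() / temperature                 # pairwise similarity logits
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, -1e9)        # exclude self-comparisons
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    n_pos = pos.sum(dim=1)                        # positives per anchor
    loss = -(log_prob * pos).sum(dim=1) / n_pos.clamp(min=1)
    return loss[n_pos > 0].mean()                 # skip anchors with no positive
```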

RADCoT
Code for "RADCoT: Retrieval-Augmented Distillation to Specialization Models for Generating Chain-of-Thoughts in Query Expansion", LREC-COLING 2024.

pytorch_lightning_transformer
(unofficial) Code for "Attention Is All You Need", NIPS 2017.
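The paper's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A direct PyTorch rendering of the published formula (the repository presumably wraps this in PyTorch Lightning modules; this sketch is standalone):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V  (Vaswani et al., 2017)."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)        # (..., L_q, L_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))  # block masked keys
    return torch.softmax(scores, dim=-1) @ v                 # weighted sum of values
```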

2021_korean_competition
Team entry for the 2021 National Institute of Korean Language AI language ability evaluation (5th of 31 teams).

symlink
Code for "JBNU-CCLab at SemEval-2022 Task 12: Machine Reading Comprehension and Span Pair Classification for Linking Mathematical Symbols to Their Descriptions", SemEval@NAACL 2022 (1st on all subtasks). https://aclanthology.org/2022.semeval-1.231/

korean_extractive_summarization
Korean extractive summarization: team code for the 2021 AI text summarization online hackathon (1st of 50 teams).
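The competition code itself is not reproduced here; as an illustration of the extractive setting, a simple baseline scores each sentence against a TF-IDF document centroid and keeps the top-k (scikit-learn sketch, all names illustrative):

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def extract_summary(sentences, k=3):
    """Pick the k sentences most similar to the document's TF-IDF centroid."""
    tfidf = TfidfVectorizer().fit_transform(sentences)   # (n_sents, vocab), sparse
    centroid = np.asarray(tfidf.mean(axis=0))            # (1, vocab) document centroid
    scores = (tfidf @ centroid.T).ravel()                # dot-product relevance scores
    top = sorted(np.argsort(scores)[-k:])                # top-k, in original order
    return [sentences[i] for i in top]
```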

MAFiD
Code for "MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data", EACL 2023 Findings.
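The paper's exact layer is not reproduced here; generically, "moving average equipped" refers to smoothing hidden states along the sequence, e.g. an exponential moving average y_t = alpha * x_t + (1 - alpha) * y_{t-1}. A generic sketch of that idea only, not the MAFiD formulation:

```python
import torch

def exponential_moving_average(h, alpha=0.5):
    """EMA over the sequence dimension of hidden states h: (batch, seq_len, dim).

    Illustrative smoothing only; the actual MAFiD layer differs.
    """
    out = torch.empty_like(h)
    out[:, 0] = h[:, 0]                                   # first step passes through
    for t in range(1, h.size(1)):
        out[:, t] = alpha * h[:, t] + (1 - alpha) * out[:, t - 1]
    return out
```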

Book-Manager-WPF
2019 Windows programming coursework: a library management program (Book Manager) in C# WPF.

A-star-Algorithm-GUI-implementation
GUI implementation of the A* algorithm (heuristic search).
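A* expands nodes in order of f(n) = g(n) + h(n), the cost so far plus an admissible heuristic estimate of the remaining cost. A compact sketch of the search itself (GUI omitted; neighbors and heuristic are caller-supplied):

```python
import heapq
from itertools import count

def a_star(start, goal, neighbors, heuristic):
    """A* search. neighbors(n) yields (next_node, step_cost);
    heuristic(n, goal) must never overestimate the remaining cost."""
    tie = count()                                     # tiebreaker for equal f-scores
    open_heap = [(heuristic(start, goal), next(tie), 0, start, [start])]
    best_g = {start: 0}                               # cheapest known cost per node
    while open_heap:
        _, _, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path, g
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):    # found a cheaper route
                best_g[nxt] = ng
                f = ng + heuristic(nxt, goal)
                heapq.heappush(open_heap, (f, next(tie), ng, nxt, path + [nxt]))
    return None, float("inf")                         # no path exists
```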

gpt2_finetune_essay
Essay generation with a fine-tuned GPT-2.
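Fine-tuning specifics aside, generating text from a GPT-2 checkpoint with Hugging Face transformers follows a standard pattern; the base "gpt2" checkpoint below is a stand-in for the repository's fine-tuned one:

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# "gpt2" is a placeholder; a fine-tuned checkpoint directory would go here.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("The meaning of open source is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,                      # sample rather than greedy decode
    top_p=0.95,                          # nucleus sampling
    pad_token_id=tokenizer.eos_token_id, # GPT-2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```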

Gmlp_pretrain
Pretraining code for a gMLP language model.
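The distinctive piece of gMLP is the Spatial Gating Unit, which replaces self-attention with a learned linear mixing over the sequence dimension. A sketch following the paper's description (Liu et al., 2021), not this repository's code:

```python
import torch
import torch.nn as nn

class SpatialGatingUnit(nn.Module):
    """gMLP Spatial Gating Unit: split channels into (u, v), mix v across the
    sequence with a learned projection, gate u elementwise. dim must be even."""
    def __init__(self, dim, seq_len):
        super().__init__()
        self.norm = nn.LayerNorm(dim // 2)
        self.proj = nn.Linear(seq_len, seq_len)
        nn.init.zeros_(self.proj.weight)   # near-identity gating at init,
        nn.init.ones_(self.proj.bias)      # as the paper recommends

    def forward(self, x):                  # x: (batch, seq_len, dim)
        u, v = x.chunk(2, dim=-1)          # each (batch, seq_len, dim // 2)
        v = self.norm(v)
        v = self.proj(v.transpose(1, 2)).transpose(1, 2)  # mix along seq_len
        return u * v                       # elementwise gate
```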

Naver-news-article-classification-using-attention-based-bi-lstm-with-pytorch
Korean (Naver) news article classification using an attention-based Bi-LSTM in PyTorch.
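The usual architecture here encodes tokens with a bidirectional LSTM and pools the hidden states with a learned attention weighting before classification. An illustrative PyTorch sketch (class name and hyperparameters are placeholders):

```python
import torch
import torch.nn as nn

class AttnBiLSTMClassifier(nn.Module):
    """Bi-LSTM encoder with attention pooling over timesteps; a sketch."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)       # scores each timestep
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        h, _ = self.lstm(self.embed(token_ids))        # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)   # attention over timesteps
        context = (weights * h).sum(dim=1)             # weighted sum pooling
        return self.fc(context)                        # class logits
```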

regression-MLP-by-numpy
NumPy implementations of linear regression, logistic regression, and an MLP.
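For instance, logistic regression in pure NumPy reduces to batch gradient descent on the mean log-loss, whose gradient is X^T (sigmoid(Xw + b) - y) / n. A minimal sketch (function name and defaults are illustrative):

```python
import numpy as np

def train_logistic_regression(X, y, lr=0.1, epochs=500):
    """Binary logistic regression via batch gradient descent.

    X: (n, d) features; y: (n,) labels in {0, 1}. Returns (weights, bias).
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid predictions
        grad_w = X.T @ (p - y) / n               # gradient of mean log-loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```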

2021_IR_opendomainqa
2021 information retrieval coursework: an open-domain question answering system over Korean Wikipedia.
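Such systems usually pair a retriever with a reader that extracts the answer span from retrieved passages. A minimal sparse (TF-IDF) retriever over a passage list, with all names illustrative and the reader omitted:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

class TfidfRetriever:
    """Retrieve the top-k passages for a question by TF-IDF cosine similarity."""
    def __init__(self, passages):
        self.passages = passages
        self.vectorizer = TfidfVectorizer()
        self.index = self.vectorizer.fit_transform(passages)  # (n_passages, vocab)

    def search(self, question, k=5):
        q = self.vectorizer.transform([question])             # (1, vocab)
        scores = cosine_similarity(q, self.index).ravel()     # score every passage
        top = scores.argsort()[::-1][:k]                      # best k, descending
        return [(self.passages[i], scores[i]) for i in top]
```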