korean-malicious-comments-dataset - Korean malicious comments dataset

CPFT - (unofficial) Code for "Few-Shot Intent Detection via Contrastive Pre-Training and Fine-Tuning", EMNLP 2021

RADCoT - Code for "RADCoT: Retrieval-Augmented Distillation to Specialization Models for Generating Chain-of-Thoughts in Query Expansion", LREC-COLING 2024

pytorch_lightning_transformer - (unofficial) Code for "Attention Is All You Need", NIPS 2017

2021_korean_competition - 2021 National Institute of Korean Language AI Language Ability Evaluation, Team 화성갈끄니까 (5th/31)

symlink - Code for "JBNU-CCLab at SemEval-2022 Task 12: Machine Reading Comprehension and Span Pair Classification for Linking Mathematical Symbols to Their Descriptions", SemEval@NAACL 2022 (1st in all subtasks) https://aclanthology.org/2022.semeval-1.231/

korean_extractive_summarization - Korean extractive summarization: Team 화성갈끄니까's code for the 2021 AI Text Summarization Online Hackathon (1st/50)

MAFiD - Code for "MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data", EACL 2023 Findings

Book-Manager-WPF - Library management program (Book Manager) in C# WPF, 2019 Windows programming

A-star-Algorithm-GUI-implementation - GUI implementation of the A* algorithm (heuristic search)

pytorch_lightning_bert - BERT implementation from scratch

Gmlp_pretrain - Pretraining code for a gMLP language model

Naver-news-article-classification-using-attention-based-bi-lstm-with-pytorch - Korean news classification using an attention-based Bi-LSTM

regression-MLP-by-numpy - Implementations of linear regression, logistic regression, and an MLP in NumPy

2021_IR_opendomainqa - 2021 Information Retrieval course project: open-domain question answering system over Korean Wikipedia
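The A-star-Algorithm-GUI-implementation entry above refers to heuristic search. As a minimal illustrative sketch of the A* idea (not the repo's actual code, which adds a GUI), a grid-based A* with a Manhattan-distance heuristic might look like:

```python
import heapq

def astar(grid, start, goal):
    """A* search on a 4-connected grid of 0 (free) / 1 (wall) cells.

    Manhattan distance is an admissible heuristic here, so the
    first time the goal is popped, the path is shortest."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # heuristic: Manhattan distance to the goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    # Each heap entry is (f = g + h, g, node, path-so-far).
    open_heap = [(h(start), 0, start, [start])]
    best_g = {start: 0}  # cheapest known cost to reach each node
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    heapq.heappush(
                        open_heap, (ng + h((r, c)), ng, (r, c), path + [(r, c)])
                    )
    return None  # goal unreachable

grid = [
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
path = astar(grid, (0, 0), (0, 2))
print(path)  # shortest route around the wall, 7 cells long
```

The heuristic steers expansion toward the goal while the `best_g` map prevents re-expanding nodes along worse routes.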