  • Stars: 51
  • Rank: 568,706 (Top 12%)
  • Language
  • Created over 4 years ago
  • Updated over 3 years ago

Repository Details

Domain adaptation in NLP.

More Repositories

1. ChatGPTPapers
   Must-read papers, related blogs and API tools on the pre-training and tuning methods for ChatGPT.
   315 stars

2. active-prompt
   Source code for the paper "Active Prompting with Chain-of-Thought for Large Language Models"
   Python, 209 stars

3. R-Tuning
   [NAACL 2024 Outstanding Paper] Source code for the NAACL 2024 paper entitled "R-Tuning: Instructing Large Language Models to Say 'I Don't Know'"
   Python, 80 stars

4. DaVinci
   Source code for the paper "Prefix Language Models are Unified Modal Learners"
   Jupyter Notebook, 42 stars

5. TILGAN
   Source code for the Findings of ACL-IJCNLP 2021 paper entitled "TILGAN: Transformer-based Implicit Latent GAN for Diverse and Coherent Text Generation"
   Python, 26 stars

6. automate-cot
   Source code for the paper "Automatic Prompt Augmentation and Selection with Chain-of-Thought from Labeled Data"
   20 stars

7. T-DNA
   Source code for the ACL-IJCNLP 2021 paper entitled "T-DNA: Taming Pre-trained Language Models with N-gram Representations for Low-Resource Domain Adaptation" by Shizhe Diao et al.
   Python, 19 stars

8. BigGAN-PyTorch-TPU-Distribute
   Distributed (multi-process) version for training BigGAN with TPU.
   Python, 9 stars

9. Post-Training-Data-Flywheel
   We aim to provide the best references to search, select, and synthesize high-quality and large-quantity data for post-training your LLMs.
   Python, 9 stars

10. awesome-transformers
   A curated list of resources dedicated to Transformers.
   8 stars

11. HashTation
   Source code for the paper "Hashtag-Guided Low-Resource Tweet Classification"
   Python, 5 stars

12. Transformers_TPU
   Transformers on TPU; trying to solve RAM issues when mapping the dataset.
   Python, 3 stars

13. BigGAN-PyTorch-TPU-Single
   Single-thread version for training BigGAN with TPU.
   Python, 3 stars

14. SEDST3
   SEDST version 3.0, based on the code for the CIKM'18 long paper "Explicit state tracking with semi-supervision for neural dialogue generation".
   Python, 3 stars

15. Doolittle
   Source code for the EMNLP 2023 paper entitled "Doolittle: Benchmarks and Corpora for Academic Writing Formalization" by Shizhe Diao et al.
   Python, 3 stars

16. BigGAN-PyTorch-TPU-Parallel
   Parallel (multi-thread) version for training BigGAN with TPU.
   Python, 2 stars

17. Black-Box-Prompt-Learning
   Source code for the paper "Black-Box Prompt Learning for Pre-trained Language Models"
   2 stars

18. TPU-Tutorial
   A tutorial for beginners who would like to use TPU with PyTorch.
   1 star

19. MATH6450-CIFAR10
   Course project for MATH6450F: training two models on CIFAR-10 to achieve good performance. The code is adapted from CIFAR-ZOO (https://github.com/BIGBALLON/CIFAR-ZOO).
   Python, 1 star