
Diffusion4NLP-Papers

A paper list about diffusion models for natural language processing.

Update News:

  • Nov. 9, 2022: Added 3 pre-print papers.
  • Oct. 18, 2022: Added papers that attempt to apply diffusion models to NLP from scratch.

Conference Papers

  1. Diffusion-LM Improves Controllable Text Generation. Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto. In NeurIPS 2022. [pdf] [code]
  2. DiffuSeq: Sequence to Sequence Text Generation with Diffusion Models. Shansan Gong, Mukai Li, Jiangtao Feng, Zhiyong Wu, Lingpeng Kong. In ICLR 2023. [pdf] [code]
  3. Latent Diffusion Energy-Based Model for Interpretable Text Modeling. Peiyu Yu, Sirui Xie, Xiaojian Ma, Baoxiong Jia, Bo Pang, Ruiqi Gao, Yixin Zhu, Song-Chun Zhu, Ying Nian Wu. [pdf] [code]
  4. Analog Bits: Generating Discrete Data using Diffusion Models with Self-Conditioning. Ting Chen, Ruixiang Zhang, Geoffrey Hinton. [pdf] [code]
  5. Structured Denoising Diffusion Models in Discrete State-Spaces. Jacob Austin, Daniel D. Johnson, Jonathan Ho, Daniel Tarlow, Rianne van den Berg. [pdf]
  6. Composable Text Controls in Latent Space with ODEs. Guangyi Liu, Zeyu Feng, Yuan Gao, Zichao Yang, Xiaodan Liang, Junwei Bao, Xiaodong He, Shuguang Cui, Zhen Li, Zhiting Hu. [pdf]
  7. DiffusER: Discrete Diffusion via Edit-based Reconstruction. Machel Reid, Vincent J. Hellendoorn, Graham Neubig. ICLR 2023. [pdf]
  8. Self-conditioned Embedding Diffusion for Text Generation. Robin Strudel, Corentin Tallec, Florent Altché, Yilun Du, Yaroslav Ganin, Arthur Mensch, Will Grathwohl, Nikolay Savinov, Sander Dieleman, Laurent Sifre, Rémi Leblond. [pdf]
  9. DiffusionBERT: Improving Generative Masked Language Models with Diffusion Models. Zhengfu He, Tianxiang Sun, Kuanning Wang, Xuanjing Huang, Xipeng Qiu. [pdf] [code]
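Several of the papers above (e.g. Diffusion-LM, DiffuSeq, Self-conditioned Embedding Diffusion) run Gaussian diffusion in a continuous word-embedding space rather than on images. As a rough orientation, here is a minimal NumPy sketch of the closed-form forward (noising) process shared by such models; the schedule values, step count, and embedding sizes are illustrative assumptions, not taken from any listed paper's code:

```python
import numpy as np

# Toy forward (noising) process for embedding-space text diffusion:
# tokens are first mapped to continuous embeddings x_0, then Gaussian
# noise is added over T steps until nearly pure noise remains.
rng = np.random.default_rng(0)

T = 1000                              # number of diffusion steps (illustrative)
betas = np.linspace(1e-4, 0.02, T)    # linear noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)       # \bar{alpha}_t = prod_{s<=t} alpha_s

def q_sample(x0, t):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps

# A "sentence" of 5 tokens with 16-dim embeddings.
x0 = rng.standard_normal((5, 16))
x_mid = q_sample(x0, t=100)     # still strongly correlated with x_0
x_end = q_sample(x0, t=T - 1)   # close to an isotropic Gaussian
```

A denoiser network is then trained to invert this process step by step; decoding back to tokens (e.g. by rounding embeddings to the nearest vocabulary item, as in Diffusion-LM) is where the listed papers differ most.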

Survey

  1. Diffusion Models: A Comprehensive Survey of Methods and Applications. Ling Yang, Zhilong Zhang, Shenda Hong, Wentao Zhang, Bin Cui. [pdf]

Comprehensive DM Papers

  1. YangLing0818/Diffusion-Models-Papers-Survey-Taxonomy
