
Consistency Models

30 epochs, consistency model with 2-step sampling, using $t_1 = 2, t_2 = 80$.

30 epochs, consistency model with 5-step sampling, using $t_i \in \{5, 10, 20, 40, 80\}$.

Unofficial Implementation of Consistency Models (paper) in pytorch.

Three days ago, the legendary Yang Song released an entirely new class of generative models, called consistency models. There aren't any open implementations yet, so here is my attempt at one.

What are they?

Diffusion models are amazing, because they enable you to sample high-fidelity, high-diversity images. The downside: you need lots of sampling steps, typically at least 20.

Progressive Distillation (Salimans & Ho, 2022) addresses this by distilling 2 steps of the diffusion model down to a single step. Doing this $N$ times boosts sampling speed by $2^N$. But is this the only way? Do we need to train a diffusion model and then distill it $N$ times? Yang didn't think so. Consistency models solve this by training a model to produce consistent denoised outputs across different timesteps (OK, I'm obviously simplifying).

Usage

Install the package with

pip install git+https://github.com/cloneofsimo/consistency_models.git

This repo mainly implements consistency training:

$$ L(\theta) = \mathbb{E}[d(f_\theta(x + t_{n + 1}z, t_{n + 1}), f_{\theta_{-}}(x + t_n z, t_n))] $$
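In code, one consistency-training step of the loss above might look like the following sketch (the function names and the choice of squared-L2 distance for $d$ are my assumptions for illustration, not this repo's exact API; the paper also uses LPIPS):

```python
import torch
import torch.nn.functional as F

def consistency_training_loss(model, ema_model, x, t_n, t_np1):
    """One consistency-training step: the online model f_theta at noise
    level t_{n+1} must match the target model f_{theta^-} (an EMA of
    theta) at noise level t_n, using the same Gaussian noise z for both."""
    z = torch.randn_like(x)
    # Online network sees the more-noised sample x + t_{n+1} z.
    pred = model(x + t_np1 * z, t_np1)
    # Target network sees the less-noised sample x + t_n z; no gradients
    # flow into the EMA parameters.
    with torch.no_grad():
        target = ema_model(x + t_n * z, t_n)
    # d(., .) chosen here as squared L2 distance.
    return F.mse_loss(pred, target)
```

Note that the target network shares the same noise realization `z` as the online network; only the noise *level* differs, which is what enforces consistency across timesteps.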

And sampling:

$$ \begin{align} z &\sim \mathcal{N}(0, I) \\ x &\leftarrow x + \sqrt{t_n ^2 - \epsilon^2} z \\ x &\leftarrow f_\theta(x, t_n) \\ \end{align} $$
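The multistep sampling loop above can be sketched as follows (a minimal illustration, assuming a `model(x, t)` callable; `eps=0.002` follows the paper's Karras-style noise range and is not taken from this repo):

```python
import torch

@torch.no_grad()
def multistep_consistency_sample(model, shape, ts, eps=0.002):
    """Multistep consistency sampling: start from pure noise at the largest
    t, then alternately re-noise to level t_n and denoise with f_theta.
    `ts` is a decreasing sequence of noise levels, e.g. [80.0, 40.0, ...]."""
    x = torch.randn(shape) * ts[0]       # x ~ N(0, t_max^2 I)
    x = model(x, torch.tensor(ts[0]))    # first denoise at the largest t
    for t_n in ts[1:]:
        z = torch.randn(shape)
        # Re-noise: x <- x + sqrt(t_n^2 - eps^2) z
        x = x + (t_n ** 2 - eps ** 2) ** 0.5 * z
        # Denoise: x <- f_theta(x, t_n)
        x = model(x, torch.tensor(t_n))
    return x
```

A single-step sample is just the `ts = [t_max]` case: one denoise from pure noise.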

There is a self-contained MNIST training example in main.py at the repository root.

python main.py

Todo

  • EMA
  • CIFAR10 Example
  • Samples are sooo fuzzy... try to get a crisp result.
  • Consistency Distillation

Reference

@misc{song2023consistency,
  title     = {Consistency Models},
  author    = {Song, Yang and Dhariwal, Prafulla and Chen, Mark and Sutskever, Ilya},
  year      = {2023},
  publisher = {arXiv},
  doi       = {10.48550/ARXIV.2303.01469},
  url       = {https://arxiv.org/abs/2303.01469}
}

@misc{salimans2022progressive,
  title     = {Progressive Distillation for Fast Sampling of Diffusion Models},
  author    = {Salimans, Tim and Ho, Jonathan},
  year      = {2022},
  publisher = {arXiv},
  doi       = {10.48550/ARXIV.2202.00512},
  url       = {https://arxiv.org/abs/2202.00512}
}
