• Stars: 367
• Rank: 116,257 (Top 3%)
• Language: Jupyter Notebook
• Created: over 1 year ago
• Updated: over 1 year ago


Repository Details

Huggingface-compatible SDXL Unet implementation that is readily hackable

Huggingface Diffusers Compatible SDXL Unet Rewrite

Find this useful? I'm accumulating coffees until they become an A100 GPU.

Buy Me A Coffee

Why do this?

While the huggingface diffusers library is amazing, nowadays its UNet2DConditionModel implementation has gotten extremely big. There are various configurations, branches during initialization, and a lot of other "important-yet-not-related" stuff that makes it hard to understand. (I am also partly to blame for this, because the LoRA-with-AttentionProcessor logic has become a rather large part of the Unet implementation.) I would argue it has grown bigger than any one researcher can reasonably understand and maintain, let alone extend. This of course has pros and cons, but for many researchers it is not ideal. (Trust me, I use diffusers all the time and still get confused by the codebase.)

Since SDXL will likely be used by many researchers, I think it is very important to have concise implementations of the models, so that SDXL can be easily understood and extended.

sdxl_rewrite.py strips out all the unnecessary parts of the original implementation and aims to be as concise as possible.

Usage

SDXL Rewrite tries to be directly compatible with the original diffusers library. You can use it like this:

import torch
from sdxl_rewrite import UNet2DConditionModel
from diffusers import DiffusionPipeline

unet_new = UNet2DConditionModel().cuda().half()

# Load weights from the original model
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    use_safetensors=True,
    variant="fp16",
).to("cuda")

unet_new.load_state_dict(pipe.unet.state_dict())

# Use the weights
pipe.unet = unet_new
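
With the rewritten UNet swapped in, the pipeline should run end-to-end exactly as before. A quick smoke test (the prompt and filename here are just placeholders):

image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=30,
).images[0]
image.save("sdxl_rewrite_smoke_test.png")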

Obviously, in practice you would never stop at just this. You would normally copy this codebase and modify it to your needs, such as adding adapters, LoRAs, other modalities, and so on (a rough sketch of one such modification follows).
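
As an illustration, here is a minimal sketch of how one might graft LoRA adapters onto the attention projections of the rewritten UNet. The to_q/to_k/to_v attribute names follow the diffusers naming convention and are an assumption about this codebase; LoRALinear, add_lora, the rank, and the init scheme are illustrative, not part of the repo.

import torch.nn as nn

class LoRALinear(nn.Module):
    # Wraps an existing nn.Linear with a low-rank residual: y = W x + scale * B(A(x)).
    def __init__(self, base: nn.Linear, rank: int = 8, scale: float = 1.0):
        super().__init__()
        self.base = base
        self.down = nn.Linear(base.in_features, rank, bias=False)
        self.up = nn.Linear(rank, base.out_features, bias=False)
        nn.init.normal_(self.down.weight, std=1.0 / rank)
        nn.init.zeros_(self.up.weight)  # start as a no-op so pretrained behavior is preserved
        self.scale = scale

    def forward(self, x):
        return self.base(x) + self.scale * self.up(self.down(x))

def add_lora(unet: nn.Module, rank: int = 8):
    # Collect targets first, then swap, so we don't mutate modules while iterating them.
    targets = []
    for module in unet.modules():
        for name in ("to_q", "to_k", "to_v"):
            child = getattr(module, name, None)
            if isinstance(child, nn.Linear):
                targets.append((module, name, child))
    for module, name, child in targets:
        setattr(module, name, LoRALinear(child, rank=rank))
    return unet

unet_new = add_lora(unet_new, rank=8).cuda().half()  # move the new adapter weights to the same device/dtype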

Have a look at example.ipynb for a few more examples of using it directly with the diffusers library.
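
If you want to convince yourself that the rewrite is numerically equivalent to the original, here is a rough sketch of such a check. It assumes you kept a handle to the original UNet before the swap (e.g. unet_orig = pipe.unet right after from_pretrained) and that the rewrite accepts the same keyword arguments as diffusers' UNet2DConditionModel; the tensor shapes follow standard SDXL conditioning (77x2048 text states, 1280-dim pooled embeds, 6 micro-conditioning time ids).

import torch

with torch.no_grad():
    sample = torch.randn(1, 4, 128, 128, device="cuda", dtype=torch.float16)
    timestep = torch.tensor([500], device="cuda")
    text_states = torch.randn(1, 77, 2048, device="cuda", dtype=torch.float16)
    added = {
        "text_embeds": torch.randn(1, 1280, device="cuda", dtype=torch.float16),
        "time_ids": torch.randn(1, 6, device="cuda", dtype=torch.float16),
    }
    out_orig = unet_orig(sample, timestep, encoder_hidden_states=text_states,
                         added_cond_kwargs=added).sample
    out_new = unet_new(sample, timestep, encoder_hidden_states=text_states,
                       added_cond_kwargs=added)
    out_new = out_new.sample if hasattr(out_new, "sample") else out_new  # the rewrite may return a bare tensor
    print("max abs diff:", (out_orig - out_new).abs().max().item())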

More Repositories

1. lora: Using Low-rank adaptation to quickly fine-tune diffusion models. (Jupyter Notebook, 6,852 stars)
2. minDiffusion: Self-contained, minimalistic implementation of diffusion models with Pytorch. (Python, 809 stars)
3. paint-with-words-sd: Implementation of Paint-with-words with Stable Diffusion, the method from eDiff-I that lets you generate an image from a text-labeled segmentation map. (Jupyter Notebook, 633 stars)
4. minRF: Minimal implementation of scalable rectified flow transformers, based on SD3's approach. (Jupyter Notebook, 322 stars)
5. consistency_models: Unofficial implementation of Consistency Models in pytorch. (Python, 244 stars)
6. d3pm: Minimal implementation of a D3PM in pytorch. (Jupyter Notebook, 159 stars)
7. min-max-gpt: Minimal (400 LOC) implementation of maximum (multi-node, FSDP) GPT training. (Python, 104 stars)
8. realformer-pytorch: Implementation of RealFormer using pytorch. (Python, 102 stars)
9. magicmix: Unofficial implementation of MagicMix. (Python, 97 stars)
10. t2i-adapter-diffusers (Python, 85 stars)
11. promptplusplus (Jupyter Notebook, 72 stars)
12. ezmup: Simple implementation of muP, based on Spectral Condition for Feature Learning. The implementation is SGD only; don't use it for Adam. (Python, 63 stars)
13. sdxl_inversions (Jupyter Notebook, 62 stars)
14. min-fsdp (Python, 62 stars)
15. karras-power-ema-tutorial (Python, 49 stars)
16. insightful-nn-papers: Papers that provide unique, insightful concepts to broaden your perspective on neural networks and deep learning. (42 stars)
17. clipping-CLIP-to-GAN (Python, 40 stars)
18. fim-llama-deepspeed (Python, 31 stars)
19. min-max-in-dit (Python, 26 stars)
20. imagenet.int8 (Python, 26 stars)
21. auto_llm_codebase_analysis (Python, 25 stars)
22. project_RF (Python, 22 stars)
23. inversion_edits (Jupyter Notebook, 17 stars)
24. efae (Python, 15 stars)
25. repa-rf (Python, 15 stars)
26. zeroshot-storytelling: Github repository for Zero Shot Visual Storytelling. (Python, 15 stars)
27. ptar (C++, 13 stars)
28. planning-with-diffusion-tutorial (Jupyter Notebook, 12 stars)
29. poly2SOP: Transformer that takes a polynomial and expresses it as a sum of powers. (Python, 11 stars)
30. smallest_working_performer (Python, 10 stars)
31. reverse_eng_deepspeed_study: DeepSpeed study, focused on reverse engineering and enhancing documentation. (Python, 6 stars)
32. n-body-dynamic-cuda (Cuda, 6 stars)
33. smallest_working_gpt: gpt that is even smaller. (Python, 6 stars)
34. minDinoV2 (Python, 6 stars)
35. infinite-fractal-stream (Jupyter Notebook, 6 stars)
36. torchcu (Python, 5 stars)
37. lora_dreambooth_replicate (Jupyter Notebook, 4 stars)
38. imgdataset_process (Python, 4 stars)
39. Algorithms-TSNE: How are algorithms really related? We use data from solved.ac and matrix factorization to find out. (Python, 4 stars)
40. rectified-flow (Jupyter Notebook, 4 stars)
41. neural-tsp-pytorch (Python, 3 stars)
42. binclone_python (Python, 3 stars)
43. policy-optimization-torch (Python, 2 stars)
44. samsung_s1t1 (Jupyter Notebook, 2 stars)
45. compare_aura_sd3: Vibe check of image-gen models (AuraFlow vs others). (HTML, 2 stars)
46. culll (Python, 2 stars)
47. latex-quick-figures: Atom package. (JavaScript, 2 stars)
48. cattalk (Jupyter Notebook, 2 stars)
49. railabweb: Source code for the Railab website. (HTML, 1 star)
50. arp-spoofing (C++, 1 star)
51. cv: Simple CV (pdf, LaTeX). (1 star)
52. send-arp (C++, 1 star)
53. project_structured_prompt (Jupyter Notebook, 1 star)
54. netfilter (C, 1 star)
55. unn-lstm-torch (Python, 1 star)
56. PGAT: On Using Transformer as Password Guessing Attacker. (Python, 1 star)
57. Super-Simple-LSTM-Template (Python, 1 star)
58. SemanticSegmentationTrainerTemplate (Python, 1 star)
59. Freshman_2: Lecture notes from my freshman 2nd semester. (HTML, 1 star)
60. vqgan-training (Python, 1 star)
61. tellghsomething: Let's write a short letter to Gyuhyun, who has gone off to the army. But automatically... (TypeScript, 1 star)
62. pcap-test (C++, 1 star)