  • Stars: 191
  • Language: Python
  • License: Apache License 2.0
  • Created almost 2 years ago; updated 11 months ago



Deploy Your Own Stable Diffusion Service

Serving Stable Diffusion with BentoML


Stable Diffusion is an open-source text-to-image model released by stability.ai. It lets you generate creative art from natural-language prompts in seconds. Follow the steps in this repository to create a production-ready Stable Diffusion service with BentoML and deploy it to AWS EC2.

Prepare the Environment

If you don't wish to build the bento from scratch, feel free to download one of the pre-built bentos.

Clone repository and install dependencies:

git clone https://github.com/bentoml/stable-diffusion-bentoml.git && cd stable-diffusion-bentoml
python3 -m venv venv && . venv/bin/activate
pip install -U pip
pip install -r requirements.txt

🎉 Environment is ready!

Create the Stable Diffusion Bento

You can either download a pre-built Stable Diffusion bento or build one from the Stable Diffusion models.

Download Pre-built Stable Diffusion Bentos

  • Download fp32 bento (for CPU or GPU with more than 10GB VRAM)

    curl -O https://s3.us-west-2.amazonaws.com/bentoml.com/stable_diffusion_bentoml/sd_fp32.bento && bentoml import ./sd_fp32.bento
  • Download fp16 bento (for GPU with less than 10GB VRAM)

    curl -O https://s3.us-west-2.amazonaws.com/bentoml.com/stable_diffusion_bentoml/sd_fp16.bento && bentoml import ./sd_fp16.bento

🎉 The Stable Diffusion bento is imported. You can advance to the "Deploy the Stable Diffusion Bento to EC2" section.

Build from Stable Diffusion Models

Choose a Stable Diffusion model

  • fp32 (for CPU or GPU with more than 10GB VRAM)

     cd fp32/
  • fp16 (for GPU with less than 10GB VRAM)

     cd fp16/

Download the Stable Diffusion model

  • For fp32 model:

     # if tar and gzip are available
     curl https://s3.us-west-2.amazonaws.com/bentoml.com/stable_diffusion_bentoml/sd_model_v1_4.tgz | tar zxf - -C models/
    
     # or if unzip is available
     curl -O https://s3.us-west-2.amazonaws.com/bentoml.com/stable_diffusion_bentoml/sd_model_v1_4.zip && unzip -d models/ sd_model_v1_4.zip
  • For fp16 model:

     # if tar and gzip are available
     curl https://s3.us-west-2.amazonaws.com/bentoml.com/stable_diffusion_bentoml/sd_model_v1_4_fp16.tgz | tar zxf - -C models/
    
     # or if unzip is available
     curl -O https://s3.us-west-2.amazonaws.com/bentoml.com/stable_diffusion_bentoml/sd_model_v1_4_fp16.zip && unzip -d models/ sd_model_v1_4_fp16.zip

Run and test the BentoML service:

  • Bring up the BentoML service with the following command.

     BENTOML_CONFIG=configuration.yaml bentoml serve service:svc --production
  • Then you can run one of the scripts to test the service.

     ../txt2img_test.sh
     ../img2img_test.sh
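The test scripts above wrap plain HTTP calls. The following is a minimal Python equivalent, a sketch only: the `/txt2img` route, the JSON body with a `prompt` field, and port 3000 (BentoML's default) are assumptions inferred from the test scripts, so check the scripts for the exact schema.

```python
# Minimal client for the local service (a sketch: the /txt2img route,
# the JSON schema, and port 3000 are assumptions, not verified API).
import json
import urllib.request

def txt2img(prompt: str, host: str = "http://127.0.0.1:3000") -> bytes:
    """POST a prompt to /txt2img and return the raw response bytes."""
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/txt2img",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example usage (requires the service to be running):
#   image = txt2img("a cabin on a snowy mountain, oil painting")
#   open("output.jpg", "wb").write(image)
```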

Build a bento:

bentoml build

Building BentoML service "stable_diffusion_fp32:abclxar26s44kcvj" from build context "/Users/ssheng/github/stable-diffusion-bentoml/fp32"
Locking PyPI package versions..

██████╗░███████╗███╗░░██╗████████╗░█████╗░███╗░░░███╗██╗░░░░░
██╔══██╗██╔════╝████╗░██║╚══██╔══╝██╔══██╗████╗░████║██║░░░░░
██████╦╝█████╗░░██╔██╗██║░░░██║░░░██║░░██║██╔████╔██║██║░░░░░
██╔══██╗██╔══╝░░██║╚████║░░░██║░░░██║░░██║██║╚██╔╝██║██║░░░░░
██████╦╝███████╗██║░╚███║░░░██║░░░╚█████╔╝██║░╚═╝░██║███████╗
╚═════╝░╚══════╝╚═╝░░╚══╝░░░╚═╝░░░░╚════╝░╚═╝░░░░░╚═╝╚══════╝

Successfully built Bento(tag="stable_diffusion_fp32:abclxar26s44kcvj")

🎉 The Stable Diffusion bento has been built! You can advance to the "Deploy the Stable Diffusion Bento to EC2" section.

Deploy the Stable Diffusion Bento to EC2

We will use bentoctl to deploy the bento to EC2. bentoctl makes it easy to deploy bentos to any cloud platform. Install the AWS EC2 operator, which generates and applies Terraform files for EC2.

bentoctl operator install aws-ec2

The deployment has already been configured for you in the deployment_config.yaml file. By default, bentoctl deploys the model on a g4dn.xlarge instance with the Deep Learning AMI GPU PyTorch 1.12.0 (Ubuntu 20.04) AMI in the us-west-1 region.

Note: This default configuration only works in the us-west-1 region. To deploy to a different region, choose the corresponding AMI ID for that region from the AWS AMI Catalog.

Generate the Terraform files.

# In the /bentoctl directory
bentoctl generate -f deployment_config.yaml

✨ generated template files.
  - ./main.tf
  - ./bentoctl.tfvars

Build the Docker image and push it to AWS ECR.

bentoctl build -b stable_diffusion_fp32:latest -f deployment_config.yaml

🚀 Image pushed!
✨ generated template files.
  - ./bentoctl.tfvars
  - ./startup_script.sh
  
There is also an experimental command that you can use.
To create the resources specified, run this after the build command.
$ bentoctl apply

To clean up all the resources created and delete the registry, run
$ bentoctl destroy

Apply the Terraform files to deploy to AWS EC2. Head over to the endpoint URL displayed at the end to see your Stable Diffusion service up and running. Run a few test prompts to make sure everything works.

bentoctl apply -f deployment_config.yaml

Apply complete! Resources: 2 added, 0 changed, 0 destroyed.

Outputs:

ec2_instance_status = "running"
endpoint = "http://53.183.151.211"
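A fresh g4dn instance can take a few minutes to boot and pull the image, so the endpoint may not answer immediately. As a sketch, the snippet below polls the service's health route until it responds; it assumes a BentoML-style `/healthz` endpoint, and the example address should be replaced with the endpoint URL printed by `bentoctl apply`.

```python
# Poll the deployed service until it is ready (a sketch; substitute the
# endpoint URL from the terraform output for the example address, and
# note that the /healthz route is an assumption about the service).
import time
import urllib.error
import urllib.request

def wait_until_ready(base_url: str, timeout_s: float = 300, interval_s: float = 10) -> bool:
    """Poll GET {base_url}/healthz until it returns 200 or the timeout expires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base_url}/healthz", timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # instance still booting or image still pulling; retry
        time.sleep(interval_s)
    return False

# Example usage:
#   if wait_until_ready("http://53.183.151.211"):
#       print("service is up")
```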

Finally, delete the deployment if the Stable Diffusion BentoML service is no longer needed.

bentoctl destroy -f deployment_config.yaml
