  • Stars: 788
  • Rank: 57,762 (Top 2%)
  • Language: TypeScript
  • License: Other
  • Created: over 3 years ago
  • Updated: 6 months ago


🦄️ Yatai: Model Deployment at Scale on Kubernetes


Yatai (屋台, food cart) lets you deploy, operate and scale Machine Learning services on Kubernetes.

It supports deploying any ML models via BentoML: the unified model serving framework.

(Screenshot: Yatai overview page)

👉 Join our Slack community today!

Looking for the fastest way to give Yatai a try? Check out BentoML Cloud to get started today.


Why Yatai?

🍱 Made for BentoML, deploy at scale

  • Scale BentoML to its full potential on a distributed system, optimized for cost saving and performance.
  • Manage deployment lifecycle to deploy, update, or rollback via API or Web UI.
  • Centralized registry providing the foundation for CI/CD via artifact management APIs, labeling, and WebHooks for custom integration.

🚅 Cloud native & DevOps friendly

  • Kubernetes-native workflow via BentoDeployment CRD (Custom Resource Definition), which can easily fit into an existing GitOps workflow.
  • Native integration with Grafana stack for observability.
  • Support for traffic control with Istio.
  • Compatible with all major cloud platforms (AWS, Azure, and GCP).

Getting Started

  • 📖 Documentation - Overview of the Yatai docs and related resources
  • ⚙️ Installation - Hands-on instructions for installing Yatai for production use
  • 👉 Join Community Slack - Get help from our community and maintainers

Quick Tour

Let's try out Yatai locally in a minikube cluster!

⚙️ Prerequisites:

  • Install the latest minikube: https://minikube.sigs.k8s.io/docs/start/
  • Install the latest Helm: https://helm.sh/docs/intro/install/
  • Start a minikube Kubernetes cluster: minikube start --cpus 4 --memory 4096. On macOS, use the hyperkit driver (minikube start --driver=hyperkit) to avoid Docker Desktop's networking limitations.
  • Check that the minikube cluster status is "Running": minikube status
  • Make sure kubectl is configured to use the minikube context: kubectl config current-context
  • Enable the ingress controller: minikube addons enable ingress
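Before starting, it can help to confirm that all of the required tools are actually on your PATH. A minimal sketch (check_tool is a hypothetical helper introduced here for illustration, not part of Yatai or minikube):

```shell
# Verify that the prerequisite CLI tools are installed and on PATH.
# check_tool is a hypothetical helper, not part of any official tooling.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: MISSING"
  fi
}

check_tool minikube
check_tool helm
check_tool kubectl
```

If any line prints MISSING, install that tool from the links above before continuing.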

🚧 Install Yatai

Install Yatai with the following script:

bash <(curl -s "https://raw.githubusercontent.com/bentoml/yatai/main/scripts/quick-install-yatai.sh")

This script will install Yatai along with its dependencies (PostgreSQL and MinIO) on your minikube cluster.

Note that this installation script is intended for development and testing only. For production deployments, see the Installation Guide.

To access Yatai web UI, run the following command and keep the terminal open:

kubectl --namespace yatai-system port-forward svc/yatai 8080:80

In a separate terminal, run:

YATAI_INITIALIZATION_TOKEN=$(kubectl get secret yatai-env --namespace yatai-system -o jsonpath="{.data.YATAI_INITIALIZATION_TOKEN}" | base64 --decode)
echo "Open in browser: http://127.0.0.1:8080/setup?token=$YATAI_INITIALIZATION_TOKEN"

Open the URL printed above from your browser to finish admin account setup.
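The two commands above simply base64-decode the token stored in the yatai-env secret and build the setup URL. The same decoding can be illustrated standalone (the encoded value below is fabricated for illustration, not a real token):

```shell
# Illustrative only: a made-up base64 value standing in for the
# YATAI_INITIALIZATION_TOKEN field normally fetched via kubectl.
ENCODED_TOKEN="ZXhhbXBsZS10b2tlbg=="   # base64 for "example-token"
YATAI_INITIALIZATION_TOKEN=$(printf '%s' "$ENCODED_TOKEN" | base64 --decode)
echo "Open in browser: http://127.0.0.1:8080/setup?token=$YATAI_INITIALIZATION_TOKEN"
```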

🍱 Push Bento to Yatai

First, get an API token and login to the BentoML CLI:

  • Keep the kubectl port-forward command in the step above running

  • Go to Yatai's API tokens page: http://127.0.0.1:8080/api_tokens

  • Create a new API token from the UI, making sure to assign "API" access under "Scopes"

  • Copy the login command shown upon token creation and run it in your shell, e.g.:

    bentoml yatai login --api-token {YOUR_TOKEN} --endpoint http://127.0.0.1:8080

If you don't already have a Bento built, run the following commands from the BentoML Quickstart Project to build a sample Bento:

git clone https://github.com/bentoml/bentoml.git && cd bentoml/examples/quickstart
pip install -r ./requirements.txt
python train.py
bentoml build

Push your newly built Bento to Yatai:

bentoml push iris_classifier:latest
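A Bento tag has the form name:version, and latest resolves to the most recently built version. When scripting around push and pull, the two parts can be split with POSIX parameter expansion:

```shell
# Split a Bento tag of the form "name:version".
BENTO_TAG="iris_classifier:latest"
BENTO_NAME="${BENTO_TAG%%:*}"     # part before the first ":"
BENTO_VERSION="${BENTO_TAG##*:}"  # part after the last ":"
echo "name=$BENTO_NAME version=$BENTO_VERSION"
```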

Now you can view and manage models and bentos from the web UI:

(Screenshot: Bento repositories page)

(Screenshot: model detail page)

🔧 Install yatai-image-builder component

Yatai's image builder feature ships as a separate component; install it with the following script:

bash <(curl -s "https://raw.githubusercontent.com/bentoml/yatai-image-builder/main/scripts/quick-install-yatai-image-builder.sh")

This installs the BentoRequest CRD (Custom Resource Definition) and the Bento CRD in your cluster. As above, this script is intended for development and testing only.

🔧 Install yatai-deployment component

Yatai's deployment feature ships as a separate component; install it with the following script:

bash <(curl -s "https://raw.githubusercontent.com/bentoml/yatai-deployment/main/scripts/quick-install-yatai-deployment.sh")

This installs the BentoDeployment CRD (Custom Resource Definition) in your cluster and enables the deployment UI in Yatai. As above, this script is intended for development and testing only.

🚢 Deploy Bento!

Once the yatai-deployment component is installed, Bentos pushed to Yatai can be deployed to your Kubernetes cluster and exposed via a Service endpoint.

A Bento deployment can be created either via the Web UI or via a Kubernetes CRD config:

Option 1. Simple Deployment via Web UI

(Screenshot: deployment creation form)

Option 2. Deploy with kubectl & CRD

Define your Bento deployment in a my_deployment.yaml file:

apiVersion: resources.yatai.ai/v1alpha1
kind: BentoRequest
metadata:
    name: iris-classifier
    namespace: yatai
spec:
    bentoTag: iris_classifier:3oevmqfvnkvwvuqj
---
apiVersion: serving.yatai.ai/v2alpha1
kind: BentoDeployment
metadata:
    name: my-bento-deployment
    namespace: yatai
spec:
    bento: iris-classifier
    ingress:
        enabled: true
    resources:
        limits:
            cpu: "500m"
            memory: "512Mi"
        requests:
            cpu: "250m"
            memory: "128Mi"
    autoscaling:
        maxReplicas: 10
        minReplicas: 2
    runners:
        - name: iris_clf
          resources:
              limits:
                  cpu: "1000m"
                  memory: "1Gi"
              requests:
                  cpu: "500m"
                  memory: "512Mi"
          autoscaling:
              maxReplicas: 4
              minReplicas: 1

Apply the deployment to your minikube cluster:

kubectl apply -f my_deployment.yaml

Now you can see the deployment process from the Yatai Web UI and find the endpoint URL for accessing the deployed Bento.

(Screenshot: deployment details page)

Community

Contributing

There are many ways to contribute to the project:

  • If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
  • Report issues you're facing and "Thumbs up" on issues and feature requests that are relevant to you.
  • Investigate bugs and review other developers' pull requests.
  • Contribute code or documentation to the project by submitting a GitHub pull request. See the development guide.

Usage Reporting

Yatai collects usage data that helps our team improve the product. Only Yatai's internal API calls are reported. We strip out as much potentially sensitive information as possible, and we never collect user code, model data, model names, or stack traces. Here's the code for usage tracking. You can opt out of usage tracking by setting the Helm chart option doNotTrack to true:

doNotTrack: true

Or by setting the YATAI_DONOT_TRACK environment variable in the Yatai deployment:

spec:
  template:
    spec:
      containers:
      - name: yatai
        env:
        - name: YATAI_DONOT_TRACK
          value: "true"
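For illustration, here is how a process might honor such a flag at startup (a sketch only; Yatai's actual check lives in its Go source):

```shell
# Hypothetical sketch of reading the opt-out flag from the environment.
YATAI_DONOT_TRACK="true"
if [ "$YATAI_DONOT_TRACK" = "true" ]; then
  echo "usage tracking disabled"
else
  echo "usage tracking enabled"
fi
```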

License

Elastic License 2.0 (ELv2)

More Repositories

  • OpenLLM (Python, 9,813 stars) - Run any open-source LLMs, such as Llama 3.1, Gemma, as OpenAI compatible API endpoint in the cloud.
  • BentoML (Python, 7,025 stars) - The easiest way to serve AI apps and models - Build reliable Inference APIs, LLM apps, Multi-model chains, RAG service, and much more!
  • BentoDiffusion (Python, 331 stars) - A collection of diffusion models served with BentoML.
  • stable-diffusion-server (Python, 196 stars) - Deploy Your Own Stable Diffusion Service.
  • bentoctl (Python, 175 stars) - Fast model deployment on any cloud 🚀
  • gallery (Python, 134 stars) - BentoML Example Projects 🎨
  • BentoVLLM (Python, 64 stars) - Self-host LLMs with vLLM and BentoML.
  • OCR-as-a-Service (Python, 49 stars) - Turn any OCR models into online inference API endpoint 🚀 🌖
  • CLIP-API-service (Jupyter Notebook, 48 stars) - CLIP as a service - embed images and sentences, object recognition, visual reasoning, image classification, and reverse image search.
  • transformers-nlp-service (Python, 43 stars) - Online inference API for NLP Transformer models - summarization, text classification, sentiment analysis, and more.
  • llm-bench (Python, 28 stars)
  • rag-tutorials (Python, 23 stars) - A series of tutorials implementing RAG services with BentoML and LlamaIndex.
  • simple_di (Python, 21 stars) - Simple dependency injection framework for Python.
  • BentoChatTTS (Python, 21 stars)
  • Fraud-Detection-Model-Serving (Jupyter Notebook, 16 stars) - Online model serving with a fraud detection model trained with XGBoost on the IEEE-CIS dataset.
  • yatai-deployment (Go, 16 stars) - 🚀 Launching Bentos in a Kubernetes cluster.
  • google-cloud-run-deploy (Python, 15 stars) - Fast model deployment on Google Cloud Run.
  • aws-sagemaker-deploy (Python, 15 stars) - Fast model deployment on AWS SageMaker.
  • aws-lambda-deploy (Python, 14 stars) - Fast model deployment on AWS Lambda.
  • aws-ec2-deploy (Python, 14 stars) - Fast model deployment on AWS EC2.
  • BentoLMDeploy (Python, 14 stars) - Self-host LLMs with LMDeploy and BentoML.
  • yatai-image-builder (Go, 14 stars) - 🐳 Build OCI images for Bentos in k8s.
  • sentence-embedding-bento (Jupyter Notebook, 14 stars) - Sentence embedding as a service.
  • IF-multi-GPUs-demo (Python, 12 stars)
  • openllm-models (Python, 10 stars)
  • BentoSVD (Python, 10 stars)
  • BentoWhisperX (Python, 10 stars)
  • diffusers-examples (Python, 10 stars) - API serving for your diffusers models.
  • BentoCLIP (Python, 8 stars) - Building a CLIP application using BentoML.
  • Pneumonia-Detection-Demo (Python, 8 stars) - Healthcare imaging application built with BentoML and a fine-tuned Vision Transformer (ViT) model.
  • yatai-chart (Mustache, 7 stars) - Helm chart for installing Yatai on Kubernetes ⎈
  • benchmark (Jupyter Notebook, 7 stars) - BentoML performance benchmark 🆚
  • BentoTRTLLM (Python, 6 stars)
  • plugins (Starlark, 6 stars) - The Swiss Army knife for all things BentoML.
  • bentoctl-operator-template (Python, 6 stars)
  • heroku-deploy (Python, 6 stars) - Deploy BentoML bundled models to Heroku.
  • quickstart (Python, 6 stars) - BentoML quickstart example.
  • BentoSentenceTransformers (Python, 5 stars) - How to build a sentence embedding application using BentoML.
  • BentoYolo (Python, 5 stars) - BentoML service for YOLOv8.
  • google-compute-engine-deploy (HCL, 5 stars)
  • bentoml-core (Rust, 5 stars)
  • BentoControlNet (Python, 4 stars)
  • BentoBark (Python, 4 stars)
  • BentoRAG (Python, 4 stars) - Tutorial: build RAG apps with custom models served with BentoML.
  • BentoXTTS (Python, 4 stars) - How to build a text-to-speech application using BentoML.
  • containerize-push-action (TypeScript, 4 stars) - Docker's build-and-push-action equivalent for BentoML.
  • BentoBLIP (Python, 3 stars) - How to build an image captioning application on top of a BLIP model with BentoML.
  • deploy-bento-action (3 stars) - A GitHub Action to deploy Bentos to the cloud.
  • azure-functions-deploy (Python, 3 stars) - Fast model deployment on Azure Functions.
  • azure-container-instances-deploy (Python, 3 stars) - Fast model deployment on Azure Container Instances.
  • BentoFunctionCalling (Python, 3 stars)
  • llm-router (Python, 3 stars) - LLM router demo.
  • BentoResnet (Python, 2 stars)
  • bentoml-arize-fraud-detection-workshop (Jupyter Notebook, 2 stars)
  • BentoSDXLTurbo (Python, 2 stars) - How to build an image generation application using BentoML.
  • BentoSearch (Python, 2 stars) - Search with LLMs.
  • BentoInfinity (Python, 2 stars)
  • BentoMLCLLM (Python, 2 stars)
  • yatai-schemas (Go, 1 star)
  • bentoctl-workshops (Python, 1 star)
  • bentocloud-homepage-news (1 star)
  • yatai-common (Go, 1 star)
  • BentoMoirai (Python, 1 star)
  • .github (1 star) - ✨🍱🦄️
  • bentoml-unsloth (Python, 1 star) - BentoML Unsloth integration.
  • BentoShield (Python, 1 star)
  • LLMGateway (Python, 1 star)
  • BentoTGI (Python, 1 star)
  • openllm-benchmark (Python, 1 star)