• This repository has been archived on 26/Feb/2024
  • Stars: 175
  • Rank: 218,059 (Top 5%)
  • Language: Python
  • License: Other
  • Created: about 3 years ago
  • Updated: 9 months ago

Repository Details

🚀 Fast model deployment on any cloud

bentoctl helps deploy any machine learning model as a production-ready API endpoint on the cloud, supporting AWS SageMaker, AWS Lambda, EC2, Google Compute Engine, Azure, Heroku, and more.

👉 Join our Slack community today!

✨ Looking to deploy your ML service quickly? Check out BentoML Cloud for the easiest and fastest way to deploy your bento. It's a full-featured, serverless environment with a model repository and built-in monitoring and logging.

Highlights

  • Framework-agnostic model deployment for TensorFlow, PyTorch, XGBoost, Scikit-Learn, ONNX, and many more via BentoML, the unified model serving framework.
  • Simplify the full deployment lifecycle: deploy, update, delete, and rollback (see the workflow sketch after this list).
  • Take full advantage of BentoML's performance optimizations and cloud platform features out of the box.
  • Tailor bentoctl to your DevOps needs by customizing the deployment operator and Terraform templates.
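
As a rough end-to-end sketch, the lifecycle looks like the commands below. This assumes the Terraform-based workflow with the aws-lambda operator; the bento tag iris_classifier:latest and the generated file names are illustrative, and exact commands and flags can differ between bentoctl versions.

# Install the operator for the target platform and scaffold a deployment config
bentoctl operator install aws-lambda
bentoctl init    # interactive; generates deployment_config.yaml and Terraform templates

# Build and push the deployable image for the bento referenced in the config (tag is hypothetical)
bentoctl build -b iris_classifier:latest -f deployment_config.yaml

# Deploy the endpoint with the generated Terraform files
terraform init
terraform apply -var-file=bentoctl.tfvars -auto-approve

# Update by rebuilding with a new bento tag and re-applying; tear down when no longer needed
bentoctl build -b iris_classifier:v2 -f deployment_config.yaml
terraform apply -var-file=bentoctl.tfvars -auto-approve
bentoctl destroy -f deployment_config.yaml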

Getting Started

Supported Platforms:

  • AWS Lambda
  • AWS SageMaker
  • AWS EC2
  • Google Cloud Run
  • Google Compute Engine
  • Azure Functions
  • Azure Container Instances
  • Heroku

Custom Operator

Users can build a custom bentoctl plugin from the deployment operator template to deploy to cloud platforms that are not yet supported, or to internal infrastructure.
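
For example, a minimal sketch of registering a custom operator, assuming bentoctl operator install accepts a local path to an operator built from the template (the directory name my-operator is hypothetical; see the template README for the exact registration steps):

# Scaffold a new operator from the template, then register the local directory with bentoctl
git clone https://github.com/bentoml/bentoctl-operator-template my-operator
bentoctl operator install ./my-operator    # path-based install is an assumption; adjust to the documented form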

If you are looking to deploy with Kubernetes, check out Yatai: Model deployment at scale on Kubernetes.

Installation

pip install bentoctl

| 💡 bentoctl is designed to work with BentoML version 1.0.0 and above. For BentoML 0.13 or below, use the pre-v1.0 branch in the operator repositories and follow the instructions in the README. You can also check out the quickstart guide for 0.13 here.
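
Since bentoctl targets BentoML 1.0 and above, it can help to install (or pin) both packages together; the version specifier below is illustrative:

# Install bentoctl alongside a compatible BentoML release
pip install bentoctl "bentoml>=1.0"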

Community

Contributing

There are many ways to contribute to the project:

  • Create and share new operators. Use the deployment operator template to get started.
  • If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
  • Report issues you're facing and give a "thumbs up" to issues and feature requests that are relevant to you.
  • Investigate bugs and review other developers' pull requests.

Usage Reporting

BentoML and bentoctl collect usage data that helps our team improve the product. Only bentoctl's CLI command calls are reported. We strip out as much potentially sensitive information as possible, and we will never collect user code, model data, model names, or stack traces. Here's the code for usage tracking. You can opt out of usage tracking by setting the environment variable BENTOML_DO_NOT_TRACK=True:

export BENTOML_DO_NOT_TRACK=True
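
You can also set the variable for a single invocation instead of exporting it for the whole shell session (the build command shown is illustrative):

# Disable usage reporting for one command only
BENTOML_DO_NOT_TRACK=True bentoctl build -b iris_classifier:latest -f deployment_config.yaml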

License

Elastic License 2.0 (ELv2)

More Repositories

1. OpenLLM (Python, 9,813 stars): Run any open-source LLMs, such as Llama 3.1, Gemma, as OpenAI compatible API endpoint in the cloud.
2. BentoML (Python, 7,025 stars): The easiest way to serve AI apps and models - Build reliable Inference APIs, LLM apps, Multi-model chains, RAG service, and much more!
3. Yatai (TypeScript, 788 stars): Model Deployment at Scale on Kubernetes đŸĻ„ī¸
4. BentoDiffusion (Python, 331 stars): A collection of diffusion models served with BentoML
5. stable-diffusion-server (Python, 196 stars): Deploy Your Own Stable Diffusion Service
6. gallery (Python, 134 stars): BentoML Example Projects 🎨
7. BentoVLLM (Python, 64 stars): Self-host LLMs with vLLM and BentoML
8. OCR-as-a-Service (Python, 49 stars): Turn any OCR models into online inference API endpoint 🚀 🌖
9. CLIP-API-service (Jupyter Notebook, 48 stars): CLIP as a service - Embed image and sentences, object recognition, visual reasoning, image classification and reverse image search
10. transformers-nlp-service (Python, 43 stars): Online Inference API for NLP Transformer models - summarization, text classification, sentiment analysis and more
11. llm-bench (Python, 28 stars)
12. rag-tutorials (Python, 23 stars): a series of tutorials implementing rag service with BentoML and LlamaIndex
13. simple_di (Python, 21 stars): Simple dependency injection framework for Python
14. BentoChatTTS (Python, 21 stars)
15. Fraud-Detection-Model-Serving (Jupyter Notebook, 16 stars): Online model serving with Fraud Detection model trained with XGBoost on IEEE-CIS dataset
16. yatai-deployment (Go, 16 stars): 🚀 Launching Bento in a Kubernetes cluster
17. google-cloud-run-deploy (Python, 15 stars): Fast model deployment on Google Cloud Run
18. aws-sagemaker-deploy (Python, 15 stars): Fast model deployment on AWS Sagemaker
19. aws-lambda-deploy (Python, 14 stars): Fast model deployment on AWS Lambda
20. aws-ec2-deploy (Python, 14 stars): Fast model deployment on AWS EC2
21. BentoLMDeploy (Python, 14 stars): Self-host LLMs with LMDeploy and BentoML
22. yatai-image-builder (Go, 14 stars): đŸŗ Build OCI images for Bentos in k8s
23. sentence-embedding-bento (Jupyter Notebook, 14 stars): Sentence Embedding as a Service
24. IF-multi-GPUs-demo (Python, 12 stars)
25. openllm-models (Python, 10 stars)
26. BentoSVD (Python, 10 stars)
27. BentoWhisperX (Python, 10 stars)
28. diffusers-examples (Python, 10 stars): API serving for your diffusers models
29. BentoCLIP (Python, 8 stars): building a CLIP application using BentoML
30. Pneumonia-Detection-Demo (Python, 8 stars): Pneumonia Detection - Healthcare Imaging Application built with BentoML and fine-tuned Vision Transformer (ViT) model
31. yatai-chart (Mustache, 7 stars): Helm Chart for installing Yatai on Kubernetes ⎈
32. benchmark (Jupyter Notebook, 7 stars): BentoML Performance Benchmark 🆚
33. BentoTRTLLM (Python, 6 stars)
34. plugins (Starlark, 6 stars): the swish knife to all things bentoml.
35. bentoctl-operator-template (Python, 6 stars)
36. heroku-deploy (Python, 6 stars): Deploy BentoML bundled models to Heroku
37. quickstart (Python, 6 stars): BentoML Quickstart Example
38. BentoSentenceTransformers (Python, 5 stars): how to build a sentence embedding application using BentoML
39. BentoYolo (Python, 5 stars): BentoML service of YOLO v8
40. google-compute-engine-deploy (HCL, 5 stars)
41. bentoml-core (Rust, 5 stars)
42. BentoControlNet (Python, 4 stars)
43. BentoBark (Python, 4 stars)
44. BentoRAG (Python, 4 stars): Tutorial: Build RAG Apps with Custom Models Served with BentoML
45. BentoXTTS (Python, 4 stars): how to build a text-to-speech application using BentoML
46. containerize-push-action (TypeScript, 4 stars): docker's build-and-push-action equivalent for bentoml
47. BentoBLIP (Python, 3 stars): how to build an image captioning application on top of a BLIP model with BentoML
48. deploy-bento-action (3 stars): A GitHub Action to deploy bento to cloud
49. azure-functions-deploy (Python, 3 stars): Fast model deployment on Azure Functions
50. azure-container-instances-deploy (Python, 3 stars): Fast model deployment on Azure container instances
51. BentoFunctionCalling (Python, 3 stars)
52. llm-router (Python, 3 stars): LLM Router Demo
53. BentoResnet (Python, 2 stars)
54. bentoml-arize-fraud-detection-workshop (Jupyter Notebook, 2 stars)
55. BentoSDXLTurbo (Python, 2 stars): how to build an image generation application using BentoML
56. BentoSearch (Python, 2 stars): Search with LLM
57. BentoInfinity (Python, 2 stars)
58. BentoMLCLLM (Python, 2 stars)
59. yatai-schemas (Go, 1 star)
60. bentoctl-workshops (Python, 1 star)
61. bentocloud-homepage-news (1 star)
62. yatai-common (Go, 1 star)
63. BentoMoirai (Python, 1 star)
64. .github (1 star): ✨🍱đŸĻ„ī¸
65. bentoml-unsloth (Python, 1 star): BentoML Unsloth integration
66. BentoShield (Python, 1 star)
67. LLMGateway (Python, 1 star)
68. BentoTGI (Python, 1 star)
69. openllm-benchmark (Python, 1 star)