  • Stars: 7,025
  • Rank: 5,568 (Top 0.2%)
  • Language: Python
  • License: Apache License 2.0
  • Created: over 5 years ago
  • Updated: about 2 months ago


Repository Details

The easiest way to serve AI apps and models - Build reliable Inference APIs, LLM apps, Multi-model chains, RAG service, and much more!

BentoML: The Unified AI Application Framework


BentoML is a framework for building reliable, scalable, and cost-efficient AI applications. It comes with everything you need for model serving, application packaging, and production deployment.

👉 Join our Slack community!

Highlights

🍱 Bento is the container for AI apps

  • Open standard and SDK for AI apps: pack your code, inference pipelines, model files, dependencies, and runtime configurations into a Bento.
  • Auto-generate API servers, supporting REST API, gRPC, and long-running inference jobs.
  • Auto-generate Docker container images.

🏄 Freedom to build with any AI models

🍭 Simplify modern AI application architecture

🚀 Deploy Anywhere

  • One-click deployment to ☁️ BentoCloud, the Serverless platform made for hosting and operating AI apps.
  • Scalable BentoML deployment with 🦄️ Yatai on Kubernetes.
  • Deploy auto-generated container images anywhere Docker runs.

Documentation

🛠️ What you can build with BentoML

Getting Started

Save or import models into the BentoML local model store:

import bentoml
import transformers

pipe = transformers.pipeline("text-classification")

bentoml.transformers.save_model(
  "text-classification-pipe",
  pipe,
  signatures={
    "__call__": {"batchable": True}  # Enable dynamic batching for model
  }
)
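The `batchable: True` signature tells BentoML's runner that it may merge concurrent single-item calls into one batched model call, which amortizes the cost of a forward pass. The sketch below illustrates the idea only; it is not BentoML's internal implementation, and `fake_pipeline` and `TinyBatcher` are hypothetical names for illustration:

```python
# Conceptual sketch of dynamic batching: merge pending single-item
# requests into batched calls. NOT BentoML internals.

def fake_pipeline(texts):
    # Stand-in for the transformers pipeline: it scores a whole batch
    # in one call, which is cheaper per item than one call per request.
    return [
        {"label": "POSITIVE" if "awesome" in t else "NEGATIVE"}
        for t in texts
    ]

class TinyBatcher:
    """Collect pending single-item requests, then run them as batches."""

    def __init__(self, model, max_batch_size=8):
        self.model = model
        self.max_batch_size = max_batch_size
        self.pending = []

    def submit(self, text):
        # In a real server this is one incoming request.
        self.pending.append(text)

    def flush(self):
        # Drain the queue, max_batch_size items per model call.
        results = []
        while self.pending:
            batch = self.pending[: self.max_batch_size]
            del self.pending[: self.max_batch_size]
            results.extend(self.model(batch))  # one call per batch
        return results

batcher = TinyBatcher(fake_pipeline, max_batch_size=2)
for text in ["BentoML is awesome", "this is bad", "also awesome"]:
    batcher.submit(text)
labels = [r["label"] for r in batcher.flush()]
print(labels)  # ['POSITIVE', 'NEGATIVE', 'POSITIVE']
```

Three submissions with `max_batch_size=2` trigger two model calls instead of three; BentoML additionally decides batch sizes and wait times adaptively at runtime.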

View all models saved locally:

$ bentoml models list

Tag                                     Module                Size        Creation Time
text-classification-pipe:kn6mr3aubcuf…  bentoml.transformers  256.35 MiB  2023-05-17 14:36:25

Define how your model runs in a service.py file:

import bentoml

model_runner = bentoml.models.get("text-classification-pipe").to_runner()

svc = bentoml.Service("text-classification-service", runners=[model_runner])

@svc.api(input=bentoml.io.Text(), output=bentoml.io.JSON())
async def classify(text: str) -> dict:
    results = await model_runner.async_run([text])
    return results[0]

Now, run the API service locally:

bentoml serve service.py:svc

Send a prediction request:

$ curl -X POST -H "Content-Type: text/plain" --data "BentoML is awesome" http://localhost:3000/classify

{"label":"POSITIVE","score":0.9129443168640137}
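The same request can be sent from Python with only the standard library. This is a sketch assuming the service from the previous step is running on localhost:3000; the `classify` helper below is an illustrative name, not a BentoML API:

```python
import json
import urllib.request

# Build the same request `curl` sends above: a POST with a plain-text
# body to the /classify endpoint of the local service.
req = urllib.request.Request(
    "http://localhost:3000/classify",
    data="BentoML is awesome".encode("utf-8"),
    headers={"Content-Type": "text/plain"},
    method="POST",
)

def classify(request):
    # Send the request and decode the JSON response body.
    with urllib.request.urlopen(request) as resp:
        return json.loads(resp.read())

# With the server running:
#   classify(req)  ->  a dict like {"label": "POSITIVE", "score": ...}
```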

Define how a Bento can be built for deployment, with bentofile.yaml:

service: 'service.py:svc'
name: text-classification-svc
include:
  - 'service.py'
python:
  packages:
  - torch>=2.0
  - transformers
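The bentofile above is minimal. The build spec also accepts metadata labels and Docker image settings; the fragment below is a hedged sketch based on the documented build options (`labels`, `docker.distro`, `docker.python_version`), with illustrative values:

```yaml
service: 'service.py:svc'
name: text-classification-svc
labels:                # free-form metadata attached to the built Bento
  owner: ml-team
include:
  - 'service.py'
python:
  packages:
    - torch>=2.0
    - transformers
docker:
  distro: debian       # base distro for the generated container image
  python_version: "3.11"
```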

Build a Bento and generate a Docker container image:

$ bentoml build
...
Successfully built Bento(tag="text-classification-svc:mc322vaubkuapuqj").
$ bentoml containerize text-classification-svc
Building OCI-compliant image for text-classification-svc:mc322vaubkuapuqj with docker
...
Successfully built Bento container for "text-classification-svc" with tag(s) "text-classification-svc:mc322vaubkuapuqj"
$ docker run -p 3000:3000 text-classification-svc:mc322vaubkuapuqj

For a more detailed user guide, check out the BentoML Tutorial.


Community

BentoML supports billions of model runs per day and is used by thousands of organizations around the globe.

Join our Community Slack 💬, where thousands of AI application developers contribute to the project and help each other.

To report a bug or suggest a feature request, use GitHub Issues.

Contributing

There are many ways to contribute to the project:

  • Report bugs and "Thumbs up" on issues that are relevant to you.
  • Investigate issues and review other developers' pull requests.
  • Contribute code or documentation to the project by submitting a GitHub pull request.
  • Check out the Contributing Guide and Development Guide to learn more.
  • Share your feedback and discuss roadmap plans in the #bentoml-contributors channel here.

Thanks to all of our amazing contributors!


Usage Reporting

BentoML collects usage data that helps our team improve the product. Only BentoML's internal API calls are reported. We strip out as much potentially sensitive information as possible, and we never collect user code, model data, model names, or stack traces. Here's the code for usage tracking. You can opt out of usage tracking with the --do-not-track CLI option:

bentoml [command] --do-not-track

Or by setting the environment variable BENTOML_DO_NOT_TRACK=True:

export BENTOML_DO_NOT_TRACK=True
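In a Python process (e.g. a notebook) the equivalent is to set the variable programmatically, before bentoml is imported so the setting applies from the start:

```python
import os

# Same effect as `export BENTOML_DO_NOT_TRACK=True`, scoped to this
# process. Set it before `import bentoml`.
os.environ["BENTOML_DO_NOT_TRACK"] = "True"
```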

License

Apache License 2.0


Citation

If you use BentoML in your research, please cite it using the following [citation](./CITATION.cff):

@software{Yang_BentoML_The_framework,
  author = {Yang, Chaoyu and Sheng, Sean and Pham, Aaron and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  title = {{BentoML: The framework for building reliable, scalable and cost-efficient AI application}},
  url = {https://github.com/bentoml/bentoml}
}

More Repositories

  1. OpenLLM (Python, 9,813 stars) - Run any open-source LLMs, such as Llama 3.1 and Gemma, as an OpenAI-compatible API endpoint in the cloud.
  2. Yatai (TypeScript, 788 stars) - Model Deployment at Scale on Kubernetes 🦄️
  3. BentoDiffusion (Python, 331 stars) - A collection of diffusion models served with BentoML
  4. stable-diffusion-server (Python, 196 stars) - Deploy Your Own Stable Diffusion Service
  5. bentoctl (Python, 175 stars) - Fast model deployment on any cloud 🚀
  6. gallery (Python, 134 stars) - BentoML Example Projects 🎨
  7. BentoVLLM (Python, 64 stars) - Self-host LLMs with vLLM and BentoML
  8. OCR-as-a-Service (Python, 49 stars) - Turn any OCR model into an online inference API endpoint 🚀 🌖
  9. CLIP-API-service (Jupyter Notebook, 48 stars) - CLIP as a service: embed images and sentences, object recognition, visual reasoning, image classification, and reverse image search
  10. transformers-nlp-service (Python, 43 stars) - Online inference API for NLP Transformer models: summarization, text classification, sentiment analysis, and more
  11. llm-bench (Python, 28 stars)
  12. rag-tutorials (Python, 23 stars) - A series of tutorials implementing a RAG service with BentoML and LlamaIndex
  13. simple_di (Python, 21 stars) - Simple dependency injection framework for Python
  14. BentoChatTTS (Python, 21 stars)
  15. Fraud-Detection-Model-Serving (Jupyter Notebook, 16 stars) - Online model serving with a fraud detection model trained with XGBoost on the IEEE-CIS dataset
  16. yatai-deployment (Go, 16 stars) - 🚀 Launching Bentos in a Kubernetes cluster
  17. google-cloud-run-deploy (Python, 15 stars) - Fast model deployment on Google Cloud Run
  18. aws-sagemaker-deploy (Python, 15 stars) - Fast model deployment on AWS SageMaker
  19. aws-lambda-deploy (Python, 14 stars) - Fast model deployment on AWS Lambda
  20. aws-ec2-deploy (Python, 14 stars) - Fast model deployment on AWS EC2
  21. BentoLMDeploy (Python, 14 stars) - Self-host LLMs with LMDeploy and BentoML
  22. yatai-image-builder (Go, 14 stars) - 🐳 Build OCI images for Bentos in k8s
  23. sentence-embedding-bento (Jupyter Notebook, 14 stars) - Sentence Embedding as a Service
  24. IF-multi-GPUs-demo (Python, 12 stars)
  25. openllm-models (Python, 10 stars)
  26. BentoSVD (Python, 10 stars)
  27. BentoWhisperX (Python, 10 stars)
  28. diffusers-examples (Python, 10 stars) - API serving for your diffusers models
  29. BentoCLIP (Python, 8 stars) - Building a CLIP application using BentoML
  30. Pneumonia-Detection-Demo (Python, 8 stars) - Healthcare imaging application built with BentoML and a fine-tuned Vision Transformer (ViT) model
  31. yatai-chart (Mustache, 7 stars) - Helm chart for installing Yatai on Kubernetes ⎈
  32. benchmark (Jupyter Notebook, 7 stars) - BentoML Performance Benchmark 🆚
  33. BentoTRTLLM (Python, 6 stars)
  34. plugins (Starlark, 6 stars) - The Swiss Army knife for all things BentoML
  35. bentoctl-operator-template (Python, 6 stars)
  36. heroku-deploy (Python, 6 stars) - Deploy BentoML bundled models to Heroku
  37. quickstart (Python, 6 stars) - BentoML Quickstart Example
  38. BentoSentenceTransformers (Python, 5 stars) - How to build a sentence embedding application using BentoML
  39. BentoYolo (Python, 5 stars) - BentoML service for YOLOv8
  40. google-compute-engine-deploy (HCL, 5 stars)
  41. bentoml-core (Rust, 5 stars)
  42. BentoControlNet (Python, 4 stars)
  43. BentoBark (Python, 4 stars)
  44. BentoRAG (Python, 4 stars) - Tutorial: build RAG apps with custom models served with BentoML
  45. BentoXTTS (Python, 4 stars) - How to build a text-to-speech application using BentoML
  46. containerize-push-action (TypeScript, 4 stars) - Docker's build-and-push-action equivalent for BentoML
  47. BentoBLIP (Python, 3 stars) - How to build an image captioning application on top of a BLIP model with BentoML
  48. deploy-bento-action (3 stars) - A GitHub Action to deploy a Bento to the cloud
  49. azure-functions-deploy (Python, 3 stars) - Fast model deployment on Azure Functions
  50. azure-container-instances-deploy (Python, 3 stars) - Fast model deployment on Azure Container Instances
  51. BentoFunctionCalling (Python, 3 stars)
  52. llm-router (Python, 3 stars) - LLM Router Demo
  53. BentoResnet (Python, 2 stars)
  54. bentoml-arize-fraud-detection-workshop (Jupyter Notebook, 2 stars)
  55. BentoSDXLTurbo (Python, 2 stars) - How to build an image generation application using BentoML
  56. BentoSearch (Python, 2 stars) - Search with LLM
  57. BentoInfinity (Python, 2 stars)
  58. BentoMLCLLM (Python, 2 stars)
  59. yatai-schemas (Go, 1 star)
  60. bentoctl-workshops (Python, 1 star)
  61. bentocloud-homepage-news (1 star)
  62. yatai-common (Go, 1 star)
  63. BentoMoirai (Python, 1 star)
  64. .github (1 star) - ✨🍱🦄️
  65. bentoml-unsloth (Python, 1 star) - BentoML Unsloth integration
  66. BentoShield (Python, 1 star)
  67. LLMGateway (Python, 1 star)
  68. BentoTGI (Python, 1 star)
  69. openllm-benchmark (Python, 1 star)