# 👨‍🍳 Open Source MLOps Stack Recipes
When we first created ZenML as an extensible MLOps framework for building portable, production-ready MLOps pipelines, we saw many of our users struggling with the pain of deploying infrastructure from scratch to run those pipelines. The community consistently asked questions like:
- How do I deploy tool X with tool Y?
- Does a combination of tool X with Y make sense?
- Isn't there an easy way to just try these stacks out to make an informed decision?
To address these questions, the ZenML team presents you with a series of Terraform-based recipes to quickly provision popular combinations of MLOps tools. These recipes will be useful for you if:
- You are at the start of your MLOps journey, and would like to explore different tools.
- You are looking for guidelines for production-grade deployments.
- You would like to run your pipelines on your chosen ZenML Stack.
If you'd like to learn more, please join our Slack and leave us a message!
## 📺 See it in action!
If you're a visual learner or just want to quickly get started with stack recipes, we have a video that might interest you. It introduces the concept of stack recipes, and shows you the entire flow of going from scratch to having a full-fledged MLOps stack that you can run your ZenML pipelines on.
Watch it below:
## 📜 List of Recipes
Recipe | Tools installed | Description |
---|---|---|
aws-kubeflow-kserve | Kubeflow on EKS, S3, ECR, MLflow Tracking, KServe | A recipe that creates a Kubeflow pipelines cluster as orchestrator, S3 artifact store, ECR container registry, MLflow experiment tracker and KServe model deployer |
aws-minimal | EKS, S3, ECR, MLflow Tracking, Seldon | AWS specific recipe to showcase a production-grade MLOps Stack with an EKS orchestrator, S3 artifact store, ECR container registry, MLflow experiment tracker and Seldon Core model deployer |
aws-stores-minimal | S3, ECR | A simple recipe to spin up an S3 artifact store and an ECR container registry |
azure-minimal | AKS, Blob Storage, ACR, MLflow Tracking, Key Vault, Seldon | Azure specific recipe that creates an MLOps stack with an AKS cluster as orchestrator, Azure Blob Storage container artifact store, ACR container registry, MLflow experiment tracker and Seldon Core model deployer |
azureml-minimal | AzureML Workspace, Blob Storage, ACR, MLflow Tracking, Key Vault | A recipe that creates an AzureML workspace with MLflow Tracking enabled and adds a compute cluster with access to Blob Storage and Key Vault |
gcp-airflow | Managed Airflow using GCP Cloud Composer, GCS Bucket, GCR | A recipe that creates a Cloud Composer environment that enables a managed Airflow environment as orchestrator, a GCS artifact store and a GCR container registry |
gcp-kubeflow-kserve | Kubeflow on GKE, GCS Bucket, GCR, MLflow Tracking, KServe, Vertex | A recipe that creates a Kubeflow pipelines cluster as orchestrator, GCS artifact store, GCR container registry, MLflow experiment tracker, KServe model deployer and option for Vertex AI as a step operator |
gcp-minimal | GKE, GCS Bucket, GCR, MLflow Tracking, Seldon | GCP specific recipe to showcase a production-grade MLOps Stack with a GKE orchestrator, GCS artifact store, GCR container repository, MLflow experiment tracker and Seldon Core model deployer |
gcp-vertexai | Vertex AI Pipelines, GCS Bucket, GCR and (optional) MLflow Tracking | A stack with a Vertex AI orchestrator, GCS artifact store, GCR container registry and an optional MLflow experiment tracker |
## 🤝 Association with ZenML
It is not necessary to use the MLOps stack recipes presented here alongside the ZenML framework. You can simply use the Terraform scripts directly.
However, ZenML works seamlessly with the infrastructure provisioned through these recipes. The ZenML CLI has an integration with this repository that makes it really simple to pull and deploy these recipes. Detailed steps are available in the READMEs of the respective recipes, but a simple flow could look like the following:
- 📃 List the available recipes in the repository.

  ```shell
  zenml stack recipe list
  ```

- Pull the recipe that you wish to deploy to your local system.

  ```shell
  zenml stack recipe pull <STACK_RECIPE_NAME>
  ```

- 🎨 Customize your deployment by editing the default values in the `locals.tf` file.

- 🔐 Add your secret information like keys and passwords into the `values.tfvars.json` file, which is not committed and only exists locally.

- 🚀 Deploy the recipe with this simple command.

  ```shell
  zenml stack recipe deploy <STACK_RECIPE_NAME>
  ```

  > **Note** If you want to allow ZenML to automatically import the created resources as a ZenML stack, pass the `--import` flag to the command above. By default, the imported stack will have the same name as the stack recipe; you can provide your own with the `--stack-name` option.

- You'll notice that a ZenML stack configuration file gets created after the previous command executes 🤯! This YAML file can be imported as a ZenML stack manually by running the following command.

  ```shell
  zenml stack import <STACK_NAME> -f <PATH_TO_THE_CREATED_STACK_CONFIG_YAML>
  ```
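Putting these steps together, an end-to-end session might look like the following sketch. The recipe name `aws-minimal` and the stack name `my_aws_stack` are purely illustrative placeholders; substitute your own values.

```shell
# List the available recipes and pull one locally
# (aws-minimal is used here only as an example).
zenml stack recipe list
zenml stack recipe pull aws-minimal

# After editing locals.tf and values.tfvars.json, deploy the recipe and
# import the provisioned resources as a ZenML stack in one step.
zenml stack recipe deploy aws-minimal --import --stack-name my_aws_stack

# Confirm that the new stack is registered and make it the active stack.
zenml stack list
zenml stack set my_aws_stack
```

These commands require a working ZenML installation and cloud credentials, so they are shown as a sketch of the flow rather than something to copy verbatim.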
To learn more about ZenML and how it empowers you to develop a stack-agnostic MLOps solution, head over to the ZenML docs.
## ⚙️ Using recipes with Terraform
Running a recipe is a matter of two simple commands. Clone the repository, and for a recipe of your choice, execute:

```shell
terraform init
terraform apply
```

> **Note** You need to have credentials for your chosen cloud provider set up before running these commands.
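How you set up those credentials depends on the provider. As a sketch, the standard vendor CLIs offer the usual authentication entry points (these are generic commands, not requirements specific to any one recipe):

```shell
# AWS: store an access key pair for the default profile.
aws configure

# GCP: create application-default credentials that Terraform can pick up.
gcloud auth application-default login

# Azure: log in interactively; Terraform's azurerm provider can reuse this session.
az login
```

Each command caches credentials locally, which Terraform's AWS, Google, and Azure providers discover automatically.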
## 🧑‍🍳 Create your own recipe

If you want to don the chef's hat and create a new recipe to cover your specific use case, we have just the ingredients you need!
- Check out this video that goes through the process of creating a new recipe by extending recipes that already exist in this repository. It's useful when you want to mix and match components across recipes.
- Learn more about the design principles behind a recipe, and find more information on testing, in the CONTRIBUTING.md guide.
## 🙏 Acknowledgements
Thank you to the folks over at Fuzzy Labs for their support and contributions to this repository.
We'd also like to acknowledge some of the cool inspirations for this project: