# MLflow On-Premise Deployment using Docker Compose

Easily deploy an MLflow tracking server with a single command.

MinIO S3 is used as the artifact store and a MySQL server is used as the backend store.
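Under the hood, the tracking server wires these two stores together at startup. A minimal sketch of the kind of launch command involved (the hostnames, database name, and credentials here are illustrative placeholders, not the exact values used by this repository; see `docker-compose.yml` for the real ones):

```bash
mlflow server \
    --backend-store-uri mysql+pymysql://mlflow_user:mlflow_password@db:3306/mlflow \
    --default-artifact-root s3://mlflow/ \
    --host 0.0.0.0 \
    --port 5000
```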
## How to run

- Clone (download) this repository

  ```bash
  git clone https://github.com/sachua/mlflow-docker-compose.git
  ```

- `cd` into the `mlflow-docker-compose` directory

- Build and run the containers with `docker-compose`

  ```bash
  docker-compose up -d --build
  ```

- Access the MLflow UI at http://localhost:5000

- Access the MinIO UI at http://localhost:9000
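To stop the stack later, tear it down with the command below (adding `-v` also removes the named volumes, which deletes all stored runs and artifacts):

```bash
docker-compose down
```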
## Containerization

The MLflow tracking server is composed of 3 Docker containers:

- MLflow server
- MinIO object storage server
- MySQL database server
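Once the stack is up, you can verify that all three containers are running (the `mlflow` service name below is an assumption; check `docker-compose.yml` for the actual service names):

```bash
# Show the status of every service in the stack
docker-compose ps

# Tail the tracking server's logs
docker-compose logs -f mlflow
```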
## Example

- Install conda

- Install MLflow with extra dependencies, including scikit-learn

  ```bash
  pip install mlflow[extras]
  ```
- Set environment variables

  ```bash
  export MLFLOW_TRACKING_URI=http://localhost:5000
  export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
  ```
- Set MinIO credentials

  ```bash
  mkdir -p ~/.aws
  cat <<EOF > ~/.aws/credentials
  [default]
  aws_access_key_id=minio
  aws_secret_access_key=minio123
  EOF
  ```
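  If you prefer not to write a credentials file, boto3 also picks up credentials from environment variables, so the following is equivalent:

  ```bash
  export AWS_ACCESS_KEY_ID=minio
  export AWS_SECRET_ACCESS_KEY=minio123
  ```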
- Train a sample MLflow model

  ```bash
  mlflow run https://github.com/mlflow/mlflow-example.git -P alpha=0.42
  ```
  Note: To fix `ModuleNotFoundError: No module named 'boto3'`:

  ```bash
  # Switch to the conda env created by `mlflow run`
  conda env list
  # Replace with your own generated mlflow environment name
  conda activate mlflow-3eee9bd7a0713cf80a17bc0a4d659bc9c549efac
  pip install boto3
  ```
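  To confirm the run's artifacts actually landed in MinIO, you can list the bucket with the AWS CLI (this assumes `awscli` is installed; the bucket name `mlflow` matches the artifact path in the serve step below):

  ```bash
  aws --endpoint-url http://localhost:9000 s3 ls s3://mlflow/ --recursive
  ```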
- Serve the model (replace with your model's actual path)

  ```bash
  mlflow models serve -m s3://mlflow/0/98bdf6ec158145908af39f86156c347f/artifacts/model -p 1234
  ```
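  Note: `mlflow models serve` downloads the model from MinIO, so run it in a shell where `MLFLOW_S3_ENDPOINT_URL` and the MinIO credentials above are still set.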
- Test the served model with a sample prediction request

  ```bash
  curl -X POST -H "Content-Type:application/json; format=pandas-split" --data '{"columns":["alcohol", "chlorides", "citric acid", "density", "fixed acidity", "free sulfur dioxide", "pH", "residual sugar", "sulphates", "total sulfur dioxide", "volatile acidity"],"data":[[12.8, 0.029, 0.48, 0.98, 6.2, 29, 3.33, 1.2, 0.39, 75, 0.66]]}' http://127.0.0.1:1234/invocations
  ```