How to deploy local MLflow model to Minikube? #5750

Open
arodindev opened this issue Jul 12, 2024 · 2 comments

@arodindev

For testing purposes I am using my local file storage for the MLflow artifact folder instead of remote storage such as S3 or GCS. To do that, I am creating a PVC and referencing it in the modelUri parameter.

My YAML looks as follows:

apiVersion: v1
kind: PersistentVolume
metadata:
  name: iris-model-pv
  namespace: seldon-system
spec:
  capacity:
    storage: 1Gi
  accessModes:
    - ReadWriteMany
  hostPath:
    path: "/home/USER/mlflow/backend/artifacts/2/f64b3afb30a148acadc3f301310ce673/artifacts/iris-model"
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: iris-model-pvc
  namespace: seldon-system
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 1Gi
---
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
  name: iris-model-deployment
  namespace: seldon-system
spec:
  name: iris-model
  predictors:
  - graph:
      children: []
      implementation: MLFLOW_SERVER
      modelUri: pvc://iris-model-pvc/
      name: iris-model
    name: default
    replicas: 1
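
One thing I am not sure about is whether the claim above actually binds to the hostPath PV, or whether the default StorageClass provisions a fresh, empty volume instead. A pinned version of the claim would look like this (just a sketch; only the storageClassName and volumeName fields are added compared to my manifest above):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: iris-model-pvc
  namespace: seldon-system
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: ""        # disable dynamic provisioning
  volumeName: iris-model-pv   # bind explicitly to the hostPath PV
  resources:
    requests:
      storage: 1Gi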

The hostPath points to the MLflow run folder that contains the artifacts (MLmodel, model.pkl, etc.). However, the iris-model container throws the following error:

Executing before-run script
---> Creating environment with Conda...
INFO:root:Copying contents of /mnt/models to local
INFO:root:Reading MLmodel file
Traceback (most recent call last):
  File "./conda_env_create.py", line 153, in <module>
    main(args)
  File "./conda_env_create.py", line 148, in main
    setup_env(model_folder)
  File "./conda_env_create.py", line 46, in setup_env
    mlmodel = read_mlmodel(model_folder)
  File "./conda_env_create.py", line 75, in read_mlmodel
    return _read_yaml(mlmodel_path)
  File "./conda_env_create.py", line 91, in _read_yaml
    with open(file_path, "r") as file:
FileNotFoundError: [Errno 2] No such file or directory: '/mnt/models/MLmodel'

The iris-model-model-initializer container completes, but it does not transfer anything, which would explain the missing /mnt/models/MLmodel:

NOTICE: Config file "/.rclone.conf" not found - using defaults
INFO  : There was nothing to transfer
INFO  :
Transferred:                 0 B / 0 B, -, 0 B/s, ETA -
Elapsed time:         0.0s
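
Since the initializer reports nothing to transfer, I suspect the claim it mounts is simply empty. Two things I still want to rule out: first, with Minikube the hostPath refers to the filesystem of the Minikube node (the VM or container), not to my laptop, so the local directory may first have to be exposed to the node, e.g. with minikube mount <local-dir>:<node-dir>; second, whether the model files are actually visible on the claim at all. A throwaway pod along these lines (the name and image are arbitrary, just for debugging) should show what would end up under /mnt/models:

apiVersion: v1
kind: Pod
metadata:
  name: pvc-inspect
  namespace: seldon-system
spec:
  containers:
  - name: inspect
    image: busybox
    command: ["sh", "-c", "ls -lR /mnt/models && sleep 3600"]
    volumeMounts:
    - name: model
      mountPath: /mnt/models
  volumes:
  - name: model
    persistentVolumeClaim:
      claimName: iris-model-pvc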

I would appreciate any help getting this to work.

@vidya4499

@arodindev I can help with this. Does the issue still exist?

@arodindev
Author

@vidya4499 not really, but feel free to post any solution.
