Model and Bento Management

BentoML allows you to store models and bentos in both local and remote repositories, and provides tools to manage the lifecycle of these artifacts. This documentation details the Python APIs and CLI commands for both the local and remote scenarios.

Managing Models Locally

Creating Models

Recall from the Getting Started guide that models are saved using the framework-specific save() function. In the example, we used the save() function from the bentoml.sklearn module for the Scikit-Learn framework.

import bentoml.sklearn
bentoml.sklearn.save("iris_classifier_model", clf)

Models can also be imported from supported framework-specific registries. In the example below, a model is imported from the MLflow Model Registry.

import bentoml.mlflow
bentoml.mlflow.import_from_uri("mlflow_model", uri=mlflow_registry_uri)

Saved and imported models are added to the local, file-system-based model store, located in the $HOME/bentoml/models directory by default. To see which types of model creation are supported for each framework, please visit our Frameworks section.
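For example, after the saves above, the model store contains one directory per model version, keyed by the auto-generated version string (an illustrative listing; the exact on-disk layout may vary between releases):

> ls $HOME/bentoml/models/iris_classifier_model/
vkorlosfifi6zhqqvtpeqaare  vlqdohsfifi6zhqqvtpeqaare  vmiqwpcfifi6zhqqvtpeqaare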

Listing Models

To list all the models created, use either the list() Python function in the bentoml.models module or the models list CLI command:

import bentoml.models
from bentoml import Tag

bentoml.models.list() # get a list of all models
# [
#   {
#     tag: Tag("iris_classifier_model", "vkorlosfifi6zhqqvtpeqaare"),
#     framework: "SKLearn",
#     created: 2021/11/14 03:55:11
#   },
#   {
#     tag: Tag("iris_classifier_model", "vlqdohsfifi6zhqqvtpeqaare"),
#     framework: "SKLearn",
#     created: 2021/11/14 03:55:15
#   },
#   {
#     tag: Tag("iris_classifier_model", "vmiqwpcfifi6zhqqvtpeqaare"),
#     framework: "SKLearn",
#     created: 2021/11/14 03:55:25
#   },
#   {
#     tag: Tag("fraud_detection_model", "5v4pdccfifi6zhqqvtpeqaare"),
#     framework: "PyTorch",
#     created: 2021/11/14 03:57:01
#   },
#   {
#     tag: Tag("fraud_detection_model", "5xorursfifi6zhqqvtpeqaare"),
#     framework: "PyTorch",
#     created: 2021/11/14 03:57:45
#   },
# ]
bentoml.models.list("iris_classifier_model") # get a list of all versions of a specific model
bentoml.models.list(Tag("iris_classifier_model", None))
# [
#   {
#     tag: Tag("iris_classifier_model", "vkorlosfifi6zhqqvtpeqaare"),
#     framework: "SKLearn",
#     created: 2021/11/14 03:55:11
#   },
#   {
#     tag: Tag("iris_classifier_model", "vlqdohsfifi6zhqqvtpeqaare"),
#     framework: "SKLearn",
#     created: 2021/11/14 03:55:15
#   },
#   {
#     tag: Tag("iris_classifier_model", "vmiqwpcfifi6zhqqvtpeqaare"),
#     framework: "SKLearn",
#     created: 2021/11/14 03:55:25
#   },
# ]
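
The models list CLI command provides the same view; passing a model name as an argument restricts the listing to that model's versions:

> bentoml models list
> bentoml models list iris_classifier_model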

To get model information, use either the get() function under the bentoml.models module or the models get CLI command.

import bentoml.models
from bentoml import Tag

bentoml.models.get("iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare")
bentoml.models.get(Tag("iris_classifier_model", "vmiqwpcfifi6zhqqvtpeqaare"))
# Model(
#   tag: Tag("iris_classifier_model", "vmiqwpcfifi6zhqqvtpeqaare"),
#   framework: "SKLearn",
#   created: 2021/11/14 03:55:25
#   description: "The iris classifier model"
#   path: "/user/home/bentoml/models/iris_classifier_model/vmiqwpcfifi6zhqqvtpeqaare"
# )
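
The same information can be printed from the command line with the models get CLI command, which takes the model tag as an argument:

> bentoml models get iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare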

Deleting Models

To delete models in the model store, use either the delete() function under the bentoml.models module or the models delete CLI command.

import bentoml.models

bentoml.models.delete("iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare", skip_confirm=True)
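
From the command line, the same model can be removed with the models delete CLI command:

> bentoml models delete iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare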

Managing Bentos Locally

Creating Bentos

Bentos are created through the bento build process. Recall from the Getting Started guide that bentos are built with the build CLI command. See Building Bentos for more details. Built bentos are added to the local, file-system-based bento store, located under the $HOME/bentoml/bentos directory by default.

> bentoml build ./bento.py:svc
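
The ./bento.py:svc argument refers to a Service object defined in Python. As a rough sketch of what such a file might contain for the iris classifier above (the runner and I/O descriptor usage here are illustrative and may differ between releases):

# bento.py - illustrative service definition
import bentoml
import bentoml.sklearn
from bentoml.io import NumpyNdarray

# wrap the saved model as a runner; ":latest" picks the newest version
iris_runner = bentoml.sklearn.load_runner("iris_classifier_model:latest")

svc = bentoml.Service("iris_classifier_service", runners=[iris_runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
def predict(input_array):
    # delegate inference to the runner
    return iris_runner.run(input_array)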

Listing Bentos

To view bentos in the bento store, use the list CLI command.

> bentoml list
BENTO                   VERSION                    LABELS      CREATED
iris_classifier_service v5mgcacfgzi6zdz7vtpeqaare  iris,prod   2021/09/19 10:15:50

Deleting Bentos

To delete bentos in the bento store, use the delete CLI command.

> bentoml delete iris_classifier_service:v5mgcacfgzi6zdz7vtpeqaare

Managing Models and Bentos Remotely with Yatai

Yatai is BentoML’s end-to-end deployment and monitoring platform. It also functions as a remote model and bento repository. To connect the CLI to a remote Yatai service, use the bentoml login command.

> bentoml login <YATAI_URL>

Once logged in, you’ll be able to use the following commands.

Pushing Models

Once you are happy with a model and ready to share it with other collaborators, you can upload it to a remote Yatai model store with the push() function under the bentoml.models module or the models push CLI command.

import bentoml.models

bentoml.models.push("iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare", skip_confirm=True)
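
The equivalent CLI command takes the model tag as an argument:

> bentoml models push iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare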

Pulling Models

Previously pushed models can be downloaded from Yatai and saved to the local model store with the pull() function under the bentoml.models module or the models pull CLI command.

import bentoml.models

bentoml.models.pull("iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare", url=yatai_url)
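
Or, equivalently, with the models pull CLI command:

> bentoml models pull iris_classifier_model:vmiqwpcfifi6zhqqvtpeqaare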

Pushing Bentos

To upload a bento from the local file system store to a remote Yatai bento store for collaboration and deployment, use the push CLI command.

> bentoml push iris_classifier_service:v5mgcacfgzi6zdz7vtpeqaare

Pulling Bentos

To download a bento from a remote Yatai bento store to the local file system bento store for troubleshooting, use the pull CLI command.

> bentoml pull iris_classifier_service:v5mgcacfgzi6zdz7vtpeqaare

Further Reading

  • Install Yatai

  • Yatai System Admin Guide

Todo

Complete and link the further reading documentation