Picklable Model#

Here’s an example of saving any Python object or function as a model and creating a runner instance:

import bentoml

class MyPicklableModel:
    def predict(self, some_integer: int):
        return some_integer ** 2

# save a given model or function with `save_model`
model = MyPicklableModel()
tag = bentoml.picklable_model.save_model(
    'mypicklablemodel',
    model,
    signatures={"predict": {"batchable": False}}
)

# retrieve the stored model entry (including its metadata) with `bentoml.picklable_model.get`:
bento_model = bentoml.picklable_model.get(tag)

# load the model back:
loaded = bentoml.picklable_model.load_model("mypicklablemodel:latest")

# run a given model under the `Runner` abstraction with `to_runner`
runner = bentoml.picklable_model.get(tag).to_runner()
runner.init_local()
runner.predict.run(7)
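
When serving, the same runner is typically attached to a bentoml.Service instead of being initialized locally. Below is a minimal sketch of that pattern; the service name "square_service" and the endpoint name "square" are illustrative, not part of the API above:

import bentoml
from bentoml.io import JSON

runner = bentoml.picklable_model.get("mypicklablemodel:latest").to_runner()

svc = bentoml.Service("square_service", runners=[runner])

@svc.api(input=JSON(), output=JSON())
async def square(some_integer: int):
    # the runner dispatches to the model's `predict` signature
    return await runner.predict.async_run(some_integer)
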
bentoml.picklable_model.save_model(name, model, *, signatures=None, labels=None, custom_objects=None, metadata=None)[source]#

Save a model instance to BentoML modelstore.

Parameters
  • name (str) – Name for the given model instance. This should pass the Python identifier check.

  • model (Any) – Instance of the model to be saved; any picklable Python object or function.

  • signatures (Dict[str, ModelSignatureDict]) – Methods to expose for running inference on the target model. Signatures are used for creating Runner instances when serving the model with bentoml.Service; see the sketch after this parameter list.

  • labels (Dict[str, str], optional, default to None) – user-defined labels for managing models, e.g. team=nlp, stage=dev

  • custom_objects (Dict[str, Any], optional, default to None) – user-defined additional python objects to be saved alongside the model, e.g. a tokenizer instance, preprocessor function, model configuration json

  • metadata (Dict[str, Any], optional, default to None) – Custom metadata for given model.
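
The signatures and custom_objects options are easiest to see together. The sketch below is illustrative (the model name, label values, and the "scale" custom object are made up): "batchable": True tells the generated runner it may merge concurrent requests into a single call along batch_dim, and anything passed via custom_objects is pickled next to the model and available again on the retrieved bentoml.Model:

import bentoml

class MyBatchModel:
    def predict(self, inputs):
        # expects a batch (list) of integers
        return [i ** 2 for i in inputs]

tag = bentoml.picklable_model.save_model(
    "my_batch_model",
    MyBatchModel(),
    # allow the runner to batch concurrent requests along dimension 0
    signatures={"predict": {"batchable": True, "batch_dim": 0}},
    labels={"stage": "dev"},
    # any picklable helper objects to store alongside the model
    custom_objects={"scale": 2},
)

# custom objects can be read back from the stored model entry
bento_model = bentoml.picklable_model.get(tag)
scale = bento_model.custom_objects["scale"]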

Returns

A tag in the format name:version, where name is the user-defined model name and version is automatically generated.

Return type

Tag

Examples:

import bentoml

# `model` can be any picklable Python object or function
bento_model = bentoml.picklable_model.save_model('picklable_pyobj', model)
bentoml.picklable_model.load_model(bento_model)[source]#

Load the picklable model with the given tag from the local BentoML model store.

Parameters
  • bento_model (str | Tag | Model) – Either the tag of the model to get from the store, or a BentoML Model instance to load the model from.

  • ...

Returns

The picklable model loaded from the model store or BentoML Model.

Return type

ModelType

Example:

import bentoml

picklable_model = bentoml.picklable_model.load_model('my_model:latest')

bentoml.picklable_model.get(tag_like)[source]#