Model Artifacts

SklearnModelArtifact

class bentoml.artifact.SklearnModelArtifact(name, pickle_extension='.pkl')

Abstraction for saving/loading scikit-learn models using sklearn.externals.joblib

Parameters
  • name (str) – Name for the artifact

  • pickle_extension (str) – The file extension for the pickled file

Raises

MissingDependencyException – sklearn package is required for SklearnModelArtifact

Example usage:

>>> from sklearn import svm
>>>
>>> model_to_save = svm.SVC(gamma='scale')
>>> # ... training model, etc.
>>>
>>> import bentoml
>>> from bentoml.artifact import SklearnModelArtifact
>>> from bentoml.adapters import DataframeInput
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([SklearnModelArtifact('model')])
>>> class SklearnModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=DataframeInput())
>>>     def predict(self, df):
>>>         result = self.artifacts.model.predict(df)
>>>         return result
>>>
>>> svc = SklearnModelService()
>>>
>>> # Pack directly with sklearn model object
>>> svc.pack('model', model_to_save)

PytorchModelArtifact

class bentoml.artifact.PytorchModelArtifact(name, file_extension='.pt')

Abstraction for saving/loading objects with torch.save and torch.load

Parameters

name (string) – name of the artifact

Raises
  • MissingDependencyException – torch package is required for PytorchModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of torch.nn.Module

Example usage:

>>> import torch.nn as nn
>>>
>>> class Net(nn.Module):
>>>     def __init__(self):
>>>         super(Net, self).__init__()
>>>         ...
>>>
>>>     def forward(self, x):
>>>         ...
>>>
>>> net = Net()
>>> # Train model with data
>>>
>>>
>>> import bentoml
>>> from bentoml.adapters import ImageInput
>>> from bentoml.artifact import PytorchModelArtifact
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([PytorchModelArtifact('net')])
>>> class PytorchModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=ImageInput())
>>>     def predict(self, imgs):
>>>         outputs = self.artifacts.net(imgs)
>>>         return outputs
>>>
>>>
>>> svc = PytorchModelService()
>>>
>>> # Pytorch model can be packed directly.
>>> svc.pack('net', net)

KerasModelArtifact

class bentoml.artifact.KerasModelArtifact(name, custom_objects=None, model_extension='.h5', store_as_json_and_weights=False)

Abstraction for saving/loading Keras model

Parameters
  • name (string) – name of the artifact

  • custom_objects (dict) – dictionary of Keras custom objects for model

  • store_as_json_and_weights (bool) – flag allowing storage of the Keras model as JSON and weights

Raises
  • MissingDependencyException – keras or tensorflow.keras package is required for KerasModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of keras.engine.network.Network, tf.keras.models.Model, or their aliases

Example usage:

>>> from tensorflow import keras
>>> from tensorflow.keras.models import Sequential
>>> from tensorflow.keras.preprocessing import sequence, text
>>>
>>> model_to_save = Sequential()
>>> # train model
>>> model_to_save.compile(...)
>>> model_to_save.fit(...)
>>>
>>> import bentoml
>>> from bentoml.artifact import KerasModelArtifact
>>> from bentoml.adapters import JsonInput
>>>
>>> @bentoml.env(pip_dependencies=['tensorflow==1.14.0', 'numpy', 'pandas'])
>>> @bentoml.artifacts([KerasModelArtifact('model')])
>>> class KerasModelService(bentoml.BentoService):
>>>     @bentoml.api(input=JsonInput())
>>>     def predict(self, parsed_json):
>>>         input_data = text.text_to_word_sequence(parsed_json['text'])
>>>         return self.artifacts.model.predict_classes(input_data)
>>>
>>> svc = KerasModelService()
>>> svc.pack('model', model_to_save)

FastaiModelArtifact

class bentoml.artifact.FastaiModelArtifact(name)

Abstraction for saving/loading FastAI models

Parameters

name (str) – Name for the fastai model

Raises
  • MissingDependencyException – fastai package is required to use FastaiModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of fastai.basic_train.Learner

Example usage:

>>> from fastai.tabular import *
>>>
>>> # prepare data
>>> data = TabularList.from_df(...)
>>> learn = tabular_learner(data, ...)
>>> # train model
>>>
>>> import bentoml
>>> from bentoml.adapters import DataframeInput
>>> from bentoml.artifact import FastaiModelArtifact
>>>
>>> @bentoml.artifacts([FastaiModelArtifact('model')])
>>> @bentoml.env(auto_pip_dependencies=True)
>>> class FastaiModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=DataframeInput())
>>>     def predict(self, df):
>>>         results = []
>>>         for _, row in df.iterrows():
>>>             prediction = self.artifacts.model.predict(row)
>>>             results.append(prediction[0].obj)
>>>         return results
>>>
>>> svc = FastaiModelService()
>>>
>>> # Pack fastai basic_learner directly
>>> svc.pack('model', learn)

TensorflowSavedModelArtifact

class bentoml.artifact.TensorflowSavedModelArtifact(name)

Abstraction for saving/loading Tensorflow model in tf.saved_model format

Parameters

name (string) – name of the artifact

Raises

MissingDependencyException – tensorflow package is required for TensorflowSavedModelArtifact

Example usage:

>>> import tensorflow as tf
>>>
>>> # Option 1: custom model with specific method call
>>> class Adder(tf.Module):
>>>     @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
>>>     def add(self, x):
>>>         return x + x + 1.
>>> model_to_save = Adder()
>>> # ... compiling, training, etc
>>>
>>> # Option 2: Sequential model (direct call only)
>>> model_to_save = tf.keras.Sequential([
>>>     tf.keras.layers.Flatten(input_shape=(28, 28)),
>>>     tf.keras.layers.Dense(128, activation='relu'),
>>>     tf.keras.layers.Dense(10, activation='softmax')
>>> ])
>>> # ... compiling, training, etc
>>>
>>> import bentoml
>>> from bentoml.adapters import JsonInput
>>> from bentoml.artifact import TensorflowSavedModelArtifact
>>>
>>> @bentoml.env(pip_dependencies=["tensorflow"])
>>> @bentoml.artifacts([TensorflowSavedModelArtifact('model')])
>>> class TfModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=JsonInput())
>>>     def predict(self, json):
>>>         input_data = json['input']
>>>         prediction = self.artifacts.model.add(input_data)
>>>         # prediction = self.artifacts.model(input_data)  # if using the Sequential model (Option 2)
>>>         return prediction.numpy()
>>>
>>> svc = TfModelService()
>>>
>>> # Option 1: pack directly with Tensorflow trackable object
>>> svc.pack('model', model_to_save)
>>>
>>> # Option 2: save to file path then pack
>>> tf.saved_model.save(model_to_save, '/tmp/adder/1')
>>> svc.pack('model', '/tmp/adder/1')

XgboostModelArtifact

class bentoml.artifact.XgboostModelArtifact(name, model_extension='.model')

Abstraction for saving/loading models with XGBoost.

Parameters
  • name (string) – name of the artifact

  • model_extension (string) – Extension for the saved XGBoost model file

Raises
  • ImportError – xgboost package is required for using XgboostModelArtifact

  • TypeError – invalid argument type, model being packed must be instance of xgboost.core.Booster

Example usage:

>>> import xgboost
>>>
>>> # prepare data
>>> params = {... params}
>>> dtrain = xgboost.DMatrix(...)
>>>
>>> # train model
>>> model_to_save = xgboost.train(params=params, dtrain=dtrain)
>>>
>>> import bentoml
>>> from bentoml.artifact import XgboostModelArtifact
>>> from bentoml.adapters import DataframeInput
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([XgboostModelArtifact('model')])
>>> class XGBoostModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=DataframeInput())
>>>     def predict(self, df):
>>>         result = self.artifacts.model.predict(df)
>>>         return result
>>>
>>> svc = XGBoostModelService()
>>> # Pack xgboost model
>>> svc.pack('model', model_to_save)

LightGBMModelArtifact

class bentoml.artifact.LightGBMModelArtifact(name, model_extension='.txt')

Abstraction for saving/loading models with LightGBM.

Parameters
  • name (string) – name of the artifact

  • model_extension (string) – Extension for the saved LightGBM model file

Raises
  • MissingDependencyException – lightgbm package is required for using LightGBMModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of lightgbm.Booster

Example usage:

>>> import lightgbm as lgb
>>> # Prepare data
>>> train_data = lgb.Dataset(...)
>>> # train model
>>> model = lgb.train(train_set=train_data, ...)
>>>
>>> import bentoml
>>> from bentoml.artifact import LightGBMModelArtifact
>>> from bentoml.adapters import DataframeInput
>>>
>>> @bentoml.artifacts([LightGBMModelArtifact('model')])
>>> @bentoml.env(auto_pip_dependencies=True)
>>> class LgbModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=DataframeInput())
>>>     def predict(self, df):
>>>         return self.artifacts.model.predict(df)
>>>
>>> svc = LgbModelService()
>>> svc.pack('model', model)

OnnxModelArtifact

class bentoml.artifact.OnnxModelArtifact(name, backend='onnxruntime')

Abstraction for saving/loading ONNX models

Parameters
  • name (string) – Name of the artifact

  • backend (string) – Name of the ONNX inference runtime backend. Defaults to 'onnxruntime'

Raises
  • MissingDependencyException – onnx package is required for packing a ModelProto object

  • NotImplementedError – the specified backend is not currently supported as an ONNX runtime

Example usage:

>>> # Train a model.
>>> from sklearn.datasets import load_iris
>>> from sklearn.model_selection import train_test_split
>>> from sklearn.ensemble import RandomForestClassifier
>>> iris = load_iris()
>>> X, y = iris.data, iris.target
>>> X_train, X_test, y_train, y_test = train_test_split(X, y)
>>> clr = RandomForestClassifier()
>>> clr.fit(X_train, y_train)

>>> # Convert into ONNX format
>>> from skl2onnx import convert_sklearn
>>> from skl2onnx.common.data_types import FloatTensorType
>>> initial_type = [('float_input', FloatTensorType([None, 4]))]
>>>
>>> onnx_model = convert_sklearn(clr, initial_types=initial_type)
>>> with open("rf_iris.onnx", "wb") as f:
>>>     f.write(onnx_model.SerializeToString())
>>>
>>>
>>> import numpy
>>> import bentoml
>>> from bentoml.artifact import OnnxModelArtifact
>>> from bentoml.adapters import DataframeInput
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([OnnxModelArtifact('model', backend='onnxruntime')])
>>> class OnnxIrisClassifierService(bentoml.BentoService):
>>>     @bentoml.api(input=DataframeInput())
>>>     def predict(self, df):
>>>         input_data = df.to_numpy().astype(numpy.float32)
>>>         input_name = self.artifacts.model.get_inputs()[0].name
>>>         output_name = self.artifacts.model.get_outputs()[0].name
>>>         return self.artifacts.model.run(
>>>                     [output_name], {input_name: input_data}
>>>                )[0]
>>>
>>> svc = OnnxIrisClassifierService()
>>>
>>> # Option one: pack with path to model on local system
>>> svc.pack('model', './rf_iris.onnx')
>>>
>>> # Option two: pack with ONNX model object
>>> # svc.pack('model', onnx_model)
>>>
>>> # Save BentoService
>>> svc.save()

H2oModelArtifact

class bentoml.artifact.H2oModelArtifact(name)

Abstraction for saving/loading objects with h2o.save_model and h2o.load_model

Parameters

name (str) – Name for this H2O artifact.

Raises

MissingDependencyException – h2o package is required to use H2oModelArtifact

Example usage:

>>> import h2o
>>> h2o.init()
>>>
>>> from h2o.estimators.deeplearning import H2ODeepLearningEstimator
>>> model_to_save = H2ODeepLearningEstimator(...)
>>> # train model with data
>>> data = h2o.import_file(...)
>>> model_to_save.train(...)
>>>
>>> import bentoml
>>> from bentoml.artifact import H2oModelArtifact
>>> from bentoml.adapters import DataframeInput
>>>
>>> @bentoml.artifacts([H2oModelArtifact('model')])
>>> @bentoml.env(auto_pip_dependencies=True)
>>> class H2oModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=DataframeInput())
>>>     def predict(self, df):
>>>         hf = h2o.H2OFrame(df)
>>>         predictions = self.artifacts.model.predict(hf)
>>>         return predictions.as_data_frame()
>>>
>>> svc = H2oModelService()
>>>
>>> svc.pack('model', model_to_save)

PickleArtifact

class bentoml.artifact.PickleArtifact(name, pickle_module=cloudpickle, pickle_extension='.pkl')

Abstraction for saving/loading python objects with pickle serialization

Parameters
  • name (str) – Name for the artifact

  • pickle_module (module|str) – The Python module used to pickle and unpickle the artifact. Defaults to BentoML’s fork of cloudpickle, which is identical to the Apache Spark fork

  • pickle_extension (str) – The file extension for the pickled file.
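
Example usage (a minimal sketch; the packed object, artifact name, and service class below are illustrative assumptions, not part of the documented API):

>>> # Any picklable Python object can be packed, e.g. a label-to-name mapping
>>> label_names = {0: 'setosa', 1: 'versicolor', 2: 'virginica'}
>>>
>>> import bentoml
>>> from bentoml.artifact import PickleArtifact
>>> from bentoml.adapters import JsonInput
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([PickleArtifact('label_names')])
>>> class LabelLookupService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=JsonInput())
>>>     def predict(self, parsed_json):
>>>         return self.artifacts.label_names[parsed_json['label']]
>>>
>>> svc = LabelLookupService()
>>> # Pack the plain Python object; it is pickled with the configured pickle_module
>>> svc.pack('label_names', label_names)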

TextFileArtifact

class bentoml.artifact.TextFileArtifact(name, file_extension='.txt', encoding='utf-8')

Abstraction for saving/loading string to/from text files

Parameters
  • name (str) – Name of the artifact

  • file_extension (str, optional) – The file extension used for the saved text file. Defaults to “.txt”

  • encoding (str) – The encoding used for saving/loading text. Defaults to “utf-8”
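
Example usage (a minimal sketch; the vocabulary content, artifact name, and service class below are illustrative assumptions, not part of the documented API):

>>> import bentoml
>>> from bentoml.artifact import TextFileArtifact
>>> from bentoml.adapters import JsonInput
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([TextFileArtifact('vocab')])
>>> class VocabFilterService(bentoml.BentoService):
>>>
>>>     @bentoml.api(input=JsonInput())
>>>     def filter_tokens(self, parsed_json):
>>>         # the artifact loads back as the original string
>>>         vocab = set(self.artifacts.vocab.splitlines())
>>>         return [w for w in parsed_json['text'].split() if w in vocab]
>>>
>>> svc = VocabFilterService()
>>> # Pack with the string content to be written out as a text file
>>> svc.pack('vocab', 'the\nquick\nbrown\nfox')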