Frameworks

Here are all of the ML frameworks supported by BentoML. You can find the official BentoML example projects in the bentoml/gallery repository, grouped by the ML training framework used in each project.

You can download the examples below and run them on your computer. Links to run them on Google Colab are also available, although some of the features demonstrated in the notebooks do not work in the Colab environment due to its limitations, including running the BentoML API model server, building docker images, and creating cloud deployments.
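
The workflow is the same for every framework below: define a BentoService class, pack the trained model into its declared artifact, and save the service to produce a deployable bundle. A minimal sketch of that last step, assuming a service instance named svc has already been packed as in the examples below:

>>> # Save the packed service; this writes a versioned BentoService bundle to disk
>>> saved_path = svc.save()
>>> print(saved_path)
>>>
>>> # The saved bundle can then be served locally with the BentoML CLI, e.g.:
>>> # $ bentoml serve SklearnModelService:latest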

Scikit-Learn

Example Projects:

class bentoml.artifact.SklearnModelArtifact(name, pickle_extension='.pkl')

Abstraction for saving/loading scikit-learn models using sklearn.externals.joblib

Parameters
  • name (str) – Name for the artifact

  • pickle_extension (str) – The extension format for pickled file

Raises

MissingDependencyException – sklearn package is required for SklearnModelArtifact

Example usage:

>>> from sklearn import svm
>>>
>>> model_to_save = svm.SVC(gamma='scale')
>>> # ... training model, etc.
>>>
>>> import bentoml
>>> from bentoml.artifact import SklearnModelArtifact
>>> from bentoml.handlers import DataframeHandler
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([SklearnModelArtifact('model')])
>>> class SklearnModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(DataframeHandler)
>>>     def predict(self, df):
>>>         result = self.artifacts.model.predict(df)
>>>         return result
>>>
>>> svc = SklearnModelService()
>>>
>>> # Pack directly with sklearn model object
>>> svc.pack('model', model_to_save)
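
Once the service is saved and served, the DataframeHandler endpoint can be called over HTTP. A hypothetical client-side sketch, assuming the server runs on localhost:5000, the endpoint is named after the predict API function, and a records-oriented JSON payload is sent (the feature names here are placeholders):

>>> import requests
>>>
>>> response = requests.post(
>>>     'http://localhost:5000/predict',
>>>     json=[{'feature_1': 0.5, 'feature_2': 1.2}],
>>> )
>>> print(response.json())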

PyTorch

Example Projects:

class bentoml.artifact.PytorchModelArtifact(name, file_extension='.pt')

Abstraction for saving/loading objects with torch.save and torch.load

Parameters

name (string) – name of the artifact

Raises
  • MissingDependencyException – torch package is required for PytorchModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of torch.nn.Module

Example usage:

>>> import torch.nn as nn
>>>
>>> class Net(nn.Module):
>>>     def __init__(self):
>>>         super(Net, self).__init__()
>>>         ...
>>>
>>>     def forward(self, x):
>>>         ...
>>>
>>> net = Net()
>>> # Train model with data
>>>
>>>
>>> import bentoml
>>> from bentoml.handlers import ImageHandler
>>> from bentoml.artifact import PytorchModelArtifact
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([PytorchModelArtifact('net')])
>>> class PytorchModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(ImageHandler)
>>>     def predict(self, imgs):
>>>         outputs = self.artifacts.net(imgs)
>>>         return outputs
>>>
>>>
>>> svc = PytorchModelService()
>>>
>>> # Pytorch model can be packed directly.
>>> svc.pack('net', net)
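
Note that ImageHandler passes the decoded image to the API function as a numpy array, so it typically needs to be converted to a torch.Tensor before calling the packed module. A minimal sketch of such a conversion (the shape handling and normalization are assumptions that depend on how Net was trained):

>>> import numpy as np
>>> import torch
>>>
>>> def to_batch_tensor(imgs):
>>>     # HxWxC uint8 image -> 1xCxHxW float tensor scaled to [0, 1]
>>>     arr = np.asarray(imgs, dtype='float32') / 255.0
>>>     return torch.from_numpy(arr).permute(2, 0, 1).unsqueeze(0)
>>>
>>> # Inside predict: outputs = self.artifacts.net(to_batch_tensor(imgs))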

Tensorflow 2.0

Example Projects:

class bentoml.artifact.TensorflowSavedModelArtifact(name)

Abstraction for saving/loading Tensorflow model in tf.saved_model format

Parameters

name (string) – name of the artifact

Raises

MissingDependencyException – tensorflow package is required for TensorflowSavedModelArtifact

Example usage:

>>> import tensorflow as tf
>>>
>>> # Option 1: custom model with specific method call
>>> class Adder(tf.Module):
>>>     @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
>>>     def add(self, x):
>>>         return x + x + 1.
>>> model_to_save = Adder()
>>> # ... compiling, training, etc
>>>
>>> # Option 2: Sequential model (direct call only)
>>> model_to_save = tf.keras.Sequential([
>>>     tf.keras.layers.Flatten(input_shape=(28, 28)),
>>>     tf.keras.layers.Dense(128, activation='relu'),
>>>     tf.keras.layers.Dense(10, activation='softmax')
>>> ])
>>> # ... compiling, training, etc
>>>
>>> import bentoml
>>> from bentoml.handlers import JsonHandler
>>> from bentoml.artifact import TensorflowSavedModelArtifact
>>>
>>> @bentoml.env(pip_dependencies=["tensorflow"])
>>> @bentoml.artifacts([TensorflowSavedModelArtifact('model')])
>>> class TfModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(JsonHandler)
>>>     def predict(self, json):
>>>         input_data = json['input']
>>>         prediction = self.artifacts.model.add(input_data)
>>>         # prediction = self.artifacts.model(input_data)  # if Sequential mode
>>>         return prediction.numpy()
>>>
>>> svc = TfModelService()
>>>
>>> # Option 1: pack directly with Tensorflow trackable object
>>> svc.pack('model', model_to_save)
>>>
>>> # Option 2: save to file path then pack
>>> tf.saved_model.save(model_to_save, '/tmp/adder/1')
>>> svc.pack('model', '/tmp/adder/1')

Tensorflow Keras

Example Projects:

class bentoml.artifact.KerasModelArtifact(name, custom_objects=None, model_extension='.h5', store_as_json_and_weights=False)

Abstraction for saving/loading Keras model

Parameters
  • name (string) – name of the artifact

  • custom_objects (dict) – dictionary of Keras custom objects for model

  • store_as_json_and_weights (bool) – flag allowing storage of the Keras model as JSON and weights

Raises
  • MissingDependencyException – keras or tensorflow.keras package is required for KerasModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of keras.engine.network.Network, tf.keras.models.Model, or their aliases

Example usage:

>>> from tensorflow import keras
>>> from tensorflow.keras.models import Sequential
>>> from tensorflow.keras.preprocessing import sequence, text
>>>
>>> model_to_save = Sequential()
>>> # train the model
>>> model_to_save.compile(...)
>>> model_to_save.fit(...)
>>>
>>> import bentoml
>>> from bentoml.artifact import KerasModelArtifact
>>> from bentoml.handlers import JsonHandler
>>>
>>> @bentoml.env(pip_dependencies=['tensorflow==1.14.0', 'numpy', 'pandas'])
>>> @bentoml.artifacts([KerasModelArtifact('model')])
>>> class KerasModelService(bentoml.BentoService):
>>>     @bentoml.api(JsonHandler)
>>>     def predict(self, parsed_json):
>>>         input_data = text.text_to_word_sequence(parsed_json['text'])
>>>         return self.artifacts.model.predict_classes(input_data)
>>>
>>> svc = KerasModelService()
>>> svc.pack('model', model_to_save)
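
The custom_objects and store_as_json_and_weights parameters are only needed in specific situations and do not appear in the example above. A hedged sketch of how they could be passed when declaring the artifact (the swish activation is a hypothetical custom object, not part of the example):

>>> from tensorflow import keras
>>>
>>> def swish(x):
>>>     # hypothetical custom activation referenced by the saved model
>>>     return x * keras.backend.sigmoid(x)
>>>
>>> @bentoml.artifacts([
>>>     KerasModelArtifact(
>>>         'model',
>>>         custom_objects={'swish': swish},    # resolve custom layers/functions at load time
>>>         store_as_json_and_weights=True,     # store architecture JSON plus weights instead of a single file
>>>     )
>>> ])
>>> class CustomKerasService(bentoml.BentoService):
>>>     pass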

FastAI

Example Projects:

class bentoml.artifact.FastaiModelArtifact(name)

Abstraction for saving/loading FastAI Learner models

Parameters

name (str) – Name for the fastai model

Raises
  • MissingDependencyException – fastai package is required for FastaiModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of fastai.basic_train.Learner

Example usage:

>>> from fastai.tabular import *
>>>
>>> # prepare data
>>> data = TabularList.from_df(...)
>>> learn = tabular_learner(data, ...)
>>> # train model
>>>
>>> import bentoml
>>> from bentoml.handlers import DataframeHandler
>>> from bentoml.artifact import FastaiModelArtifact
>>>
>>> @bentoml.artifacts([FastaiModelArtifact('model')])
>>> @bentoml.env(auto_pip_dependencies=True)
>>> class FastaiModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(DataframeHandler)
>>>     def predict(self, df):
>>>         results = []
>>>         for _, row in df.iterrows():
>>>             prediction = self.artifacts.model.predict(row)
>>>             results.append(prediction[0].obj)
>>>         return results
>>>
>>> svc = FastaiModelService()
>>>
>>> # Pack fastai basic_learner directly
>>> svc.pack('model', learn)

class bentoml.handlers.FastaiImageHandler(input_names='image', accept_image_formats=None, convert_mode='RGB', div=True, cls=None, after_open=None, **base_kwargs)

BentoHandler for handling image input following fastai conventions: it passes a fastai.vision.Image object to the user API function and provides options such as div, cls, and after_open (see the usage sketch below).

Parameters
  • input_names ([str]) – A tuple of acceptable input names for the HTTP request. Default value is ('image',)

  • accept_image_formats ([str]) – A list of acceptable image formats. Default value is loaded from the bentoml config ‘apiserver/default_image_handler_accept_file_extensions’, which is set to [‘.jpg’, ‘.png’, ‘.jpeg’, ‘.tiff’, ‘.webp’, ‘.bmp’] by default. A list of all supported formats can be found here: https://imageio.readthedocs.io/en/stable/formats.html

  • convert_mode (str) – The pilmode to be used for reading image file into numpy array. Default value is ‘RGB’. Find more information at https://imageio.readthedocs.io/en/stable/format_png-pil.html

  • div (bool) – If True, pixel values are divided by 255 to become floats between 0. and 1.

  • cls (Class) – Parameter from fastai.vision open_image, default is fastai.vision.Image

  • after_open (func) – Parameter from fastai.vision open_image, default is None

Raises
  • ImportError – imageio package is required to use FastaiImageHandler

  • ImportError – fastai package is required to use FastaiImageHandler
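
A minimal sketch of how FastaiImageHandler could be wired into a BentoService together with FastaiModelArtifact (the service and artifact names here are placeholders, not taken from the examples above):

>>> import bentoml
>>> from bentoml.artifact import FastaiModelArtifact
>>> from bentoml.handlers import FastaiImageHandler
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([FastaiModelArtifact('learner')])
>>> class FastaiImageService(bentoml.BentoService):
>>>
>>>     @bentoml.api(FastaiImageHandler)
>>>     def predict(self, image):
>>>         # image arrives as a fastai.vision.Image, ready for Learner.predict
>>>         prediction = self.artifacts.learner.predict(image)
>>>         return str(prediction[0])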

XGBoost

Example Projects:

class bentoml.artifact.XgboostModelArtifact(name, model_extension='.model')

Abstraction for saving/loading objects with XGBoost.

Parameters
  • name (string) – name of the artifact

  • model_extension (string) – Extension name for saved xgboost model

Raises
  • ImportError – xgboost package is required for using XgboostModelArtifact

  • TypeError – invalid argument type, model being packed must be instance of xgboost.core.Booster

Example usage:

>>> import xgboost
>>>
>>> # prepare data
>>> params = {... params}
>>> dtrain = xgboost.DMatrix(...)
>>>
>>> # train model
>>> model_to_save = xgboost.train(params=params, dtrain=dtrain)
>>>
>>> import bentoml
>>> from bentoml.artifact import XgboostModelArtifact
>>> from bentoml.handlers import DataframeHandler
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([XgboostModelArtifact('model')])
>>> class XGBoostModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(DataframeHandler)
>>>     def predict(self, df):
>>>         # xgboost.core.Booster.predict expects a DMatrix rather than a DataFrame
>>>         result = self.artifacts.model.predict(xgboost.DMatrix(df))
>>>         return result
>>>
>>> svc = XGBoostModelService()
>>> # Pack xgboost model
>>> svc.pack('model', model_to_save)

LightGBM

Example Projects:

class bentoml.artifact.LightGBMModelArtifact(name, model_extension='.txt')

Abstraction for saving/loading objects with LightGBM.

Parameters
  • name (string) – name of the artifact

  • model_extension (string) – Extension name for the saved LightGBM model

Raises
  • MissingDependencyException – lightgbm package is required for using LightGBMModelArtifact

  • InvalidArgument – invalid argument type, model being packed must be instance of lightgbm.Booster

Example usage:

>>> import lightgbm as lgb
>>> # Prepare data
>>> train_data = lgb.Dataset(...)
>>> # train model
>>> model = lgb.train(train_set=train_data, ...)
>>>
>>> import bentoml
>>> from bentoml.artifact import LightGBMModelArtifact
>>> from bentoml.handlers import DataframeHandler
>>>
>>> @bentoml.artifacts([LightGBMModelArtifact('model')])
>>> @bentoml.env(auto_pip_dependencies=True)
>>> class LgbModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(DataframeHandler)
>>>     def predict(self, df):
>>>         return self.artifacts.model.predict(df)
>>>
>>> svc = LgbModelService()
>>> svc.pack('model', model)

FastText

Example Projects:

class bentoml.artifact.FasttextModelArtifact(name)

Abstraction for saving/loading fasttext models

Parameters

name (str) – Name for the artifact

Raises

MissingDependencyException – fasttext package is required for FasttextModelArtifact

Example usage:

>>> import fasttext
>>> # prepare training data and store to file
>>> training_data_file = 'training-data-file.train'
>>> model = fasttext.train_supervised(input=training_data_file)
>>>
>>> import bentoml
>>> from bentoml.handlers import JsonHandler
>>> from bentoml.artifact import FasttextModelArtifact
>>>
>>> @bentoml.env(auto_pip_dependencies=True)
>>> @bentoml.artifacts([FasttextModelArtifact('model')])
>>> class FasttextModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(JsonHandler)
>>>     def predict(self, parsed_json):
>>>         # k=5 returns the top 5 predicted labels and their probabilities
>>>         return self.artifacts.model.predict(parsed_json['text'], k=5)
>>>
>>> svc = FasttextModelService()
>>> svc.pack('model', model)

H2O

Example Projects:

class bentoml.artifact.H2oModelArtifact(name)

Abstraction for saving/loading objects with h2o.save_model and h2o.load_model

Parameters

name (str) – Name for this H2O artifact.

Raises

MissingDependencyException – h2o package is required to use H2o model artifact

Example usage:

>>> import h2o
>>> h2o.init()
>>>
>>> from h2o.estimators.deeplearning import H2ODeepLearningEstimator
>>> model_to_save = H2ODeepLearningEstimator(...)
>>> # train model with data
>>> data = h2o.import_file(...)
>>> model_to_save.train(...)
>>>
>>> import bentoml
>>> from bentoml.artifact import H2oModelArtifact
>>> from bentoml.handlers import DataframeHandler
>>>
>>> @bentoml.artifacts([H2oModelArtifact('model')])
>>> @bentoml.env(auto_pip_dependencies=True)
>>> class H2oModelService(bentoml.BentoService):
>>>
>>>     @bentoml.api(DataframeHandler)
>>>     def predict(self, df):
>>>         hf = h2o.H2OFrame(df)
>>>         predictions = self.artifacts.model.predict(hf)
>>>         return predictions.as_data_frame()
>>>
>>> svc = H2oModelService()
>>>
>>> svc.pack('model', model_to_save)