TensorFlow

About this page

This is an API reference for TensorFlow in BentoML. Please refer to the TensorFlow guide for more information about how to use TensorFlow in BentoML.

Note

You can find more examples for TensorFlow in our bentoml/examples (https://github.com/bentoml/BentoML/tree/main/examples) directory.

bentoml.tensorflow.save_model(name: str, model: tf_ext.KerasModel | tf_ext.Module, *, tf_signatures: tf_ext.ConcreteFunction | None = None, tf_save_options: tf_ext.SaveOptions | None = None, signatures: dict[str, ModelSignature] | dict[str, ModelSignatureDict] | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: list[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model

Save a model instance to the BentoML model store.

Parameters
  • name (str) – Name for the given model instance. This should pass the Python identifier check.

  • model (keras.Model | tf.Module) – Instance of the model to be saved.

  • tf_signatures (Union[Callable[..., Any], dict], optional, default to None) – Refer to the Signatures explanation in the TensorFlow documentation for more information.

  • tf_save_options (tf.saved_model.SaveOptions, optional, default to None) – tf.saved_model.SaveOptions object that specifies options for saving.

  • signatures (dict[str, bool | BatchDimType | AnyType | tuple[AnyType]], optional, default to None) – Methods to expose for running inference on the target model. Signatures are used for creating Runner instances when serving the model with bentoml.Service.

  • labels (Dict[str, str], optional, default to None) – user-defined labels for managing models, e.g. team=nlp, stage=dev

  • custom_objects (Dict[str, Any]], optional, default to None) – user-defined additional python objects to be saved alongside the model, e.g. a tokenizer instance, preprocessor function, model configuration json

  • external_modules (List[ModuleType], optional, default to None) – user-defined additional python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, preprocessor module, model configuration module

  • metadata (Dict[str, Any], optional, default to None) – Custom metadata for given model.

Raises

ValueError – If the given model instance is not trackable.

Returns

A bentoml.Model instance whose tag has the format name:version, where name is the user-defined model name and version is generated by BentoML.

Return type

bentoml.Model

Examples:

import tensorflow as tf
import numpy as np
import bentoml

class NativeModel(tf.Module):
    def __init__(self):
        super().__init__()
        self.weights = np.asarray([[1.0], [1.0], [1.0], [1.0], [1.0]], dtype=np.float64)
        self.dense = lambda inputs: tf.matmul(inputs, self.weights)

    @tf.function(
        input_signature=[tf.TensorSpec(shape=[1, 5], dtype=tf.float64, name="inputs")]
    )
    def __call__(self, inputs):
        return self.dense(inputs)

# then save the given model to BentoML modelstore:
model = NativeModel()
bento_model = bentoml.tensorflow.save_model("native_toy", model)
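
To control batching behavior or attach extra assets, signatures, labels, and custom_objects can be passed explicitly. Below is a minimal sketch reusing the model above; the preprocessor function and label values are illustrative placeholders:

def preprocess(x):
    # illustrative helper, saved alongside the model via custom_objects
    return np.asarray(x, dtype=np.float64)

bento_model = bentoml.tensorflow.save_model(
    "native_toy",
    model,
    # expose __call__ with adaptive batching on the first dimension
    signatures={"__call__": {"batchable": True, "batch_dim": 0}},
    labels={"team": "nlp", "stage": "dev"},
    custom_objects={"preprocessor": preprocess},
)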

Note

The bentoml.tensorflow.save_model API also supports saving models that use RaggedTensor, as well as Keras models. If you save a Keras model with bentoml.tensorflow.save_model, the model will be saved in the SavedModel format instead of HDF5 (.h5).
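
For instance, a Keras model can be saved the same way; this is a minimal sketch assuming a trivial, untrained Sequential model (the name "keras_toy" is hypothetical):

import tensorflow as tf
import bentoml

# a trivial Keras model, untrained and for illustration only
keras_model = tf.keras.Sequential([
    tf.keras.Input(shape=(5,)),
    tf.keras.layers.Dense(1),
])

# stored in the SavedModel format, not HDF5
bento_model = bentoml.tensorflow.save_model("keras_toy", keras_model)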

bentoml.tensorflow.load_model(bento_model: str | Tag | bentoml.Model, device_name: str = '/device:CPU:0') → tf_ext.AutoTrackable | tf_ext.Module

Load a TensorFlow model from the local BentoML model store with the given tag.

Parameters
  • bento_model (str | Tag | Model) – Either the tag of the model to get from the store, or a BentoML Model instance to load the model from.

  • device_name (str, optional, default to '/device:CPU:0') – The device ID to load the model on. The device ID format should be compatible with tf.device.

Returns

A model in the SavedModel format, loaded from the BentoML model store.

Return type

SavedModel

Examples:

import bentoml

# load a model back into memory
model = bentoml.tensorflow.load_model("my_tensorflow_model")
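
To place the model on a specific device, pass a tf.device-compatible string as device_name; a hedged sketch that assumes a GPU is available:

# load onto the first GPU instead of the default CPU device
model = bentoml.tensorflow.load_model(
    "my_tensorflow_model",
    device_name="/device:GPU:0",
)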

bentoml.tensorflow.get(tag_like: str | bentoml._internal.tag.Tag) → Model

Get the BentoML model with the given tag.
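
A minimal usage sketch, assuming the tag my_tensorflow_model exists in the local model store:

import bentoml

# fetch the stored model by tag and build a Runner for serving
bento_model = bentoml.tensorflow.get("my_tensorflow_model:latest")
runner = bento_model.to_runner()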