TensorFlow#
This is the API reference for TensorFlow in BentoML. Refer to the TensorFlow guide for more information about how to use TensorFlow in BentoML.
Note
You can find more examples for TensorFlow in our BentoML/examples directory.
- bentoml.tensorflow.save_model(name: str, model: tf_ext.KerasModel | tf_ext.Module, *, tf_signatures: tf_ext.ConcreteFunction | None = None, tf_save_options: tf_ext.SaveOptions | None = None, signatures: dict[str, ModelSignature] | dict[str, ModelSignatureDict] | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: list[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model [source]#
Save a model instance to the BentoML modelstore.
- Parameters:
name – Name for the given model instance. This should pass the Python identifier check.
model – Instance of the model to be saved.
tf_signatures – Refer to the Signatures explanation in the TensorFlow documentation for more information.
tf_save_options – TensorFlow save options.
signatures – Methods to expose for running inference on the target model. Signatures are used for creating Runner instances when serving the model with bentoml.Service.
labels – User-defined labels for managing models, e.g. team=nlp, stage=dev.
custom_objects – User-defined additional Python objects to be saved alongside the model, e.g. a tokenizer instance, a preprocessor function, or a model configuration JSON.
external_modules – User-defined additional Python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, a preprocessor module, or a model configuration module.
metadata – Custom metadata for the given model.
- Raises:
ValueError – If obj is not trackable.
- Returns:
A tag with the format name:version, where name is the user-defined model's name and version is generated by BentoML.
- Return type:
bentoml.Model
Examples:
```python
import tensorflow as tf
import numpy as np
import bentoml

class NativeModel(tf.Module):
    def __init__(self):
        super().__init__()
        self.weights = np.asfarray([[1.0], [1.0], [1.0], [1.0], [1.0]])
        self.dense = lambda inputs: tf.matmul(inputs, self.weights)

    @tf.function(
        input_signature=[tf.TensorSpec(shape=[1, 5], dtype=tf.float64, name="inputs")]
    )
    def __call__(self, inputs):
        return self.dense(inputs)

# then save the given model to the BentoML modelstore:
model = NativeModel()
bento_model = bentoml.tensorflow.save_model("native_toy", model)
```
Note
The bentoml.tensorflow.save_model API also supports saving RaggedTensor models and Keras models. If you choose to save a Keras model with bentoml.tensorflow.save_model, the model will be saved in the SavedModel format instead of h5.
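As a sketch of the signatures and custom_objects arguments described above (the dictionary shape follows ModelSignatureDict; the vocabulary dict is a hypothetical stand-in for a real tokenizer):

```python
# Sketch: a signatures dict in the ModelSignatureDict shape, marking
# __call__ as batchable along dimension 0 so the Runner can batch requests.
signatures = {
    "__call__": {"batchable": True, "batch_dim": 0},
}

# custom_objects can hold arbitrary Python helpers saved alongside the
# model, e.g. a (hypothetical) vocabulary for a tokenizer:
custom_objects = {"vocab": {"hello": 0, "world": 1}}

# These would then be passed through as:
# bentoml.tensorflow.save_model("native_toy", model,
#                               signatures=signatures,
#                               custom_objects=custom_objects)
print(signatures["__call__"]["batchable"])
```

Marking a method batchable lets BentoML's adaptive batching group concurrent requests along batch_dim when the model is served.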
- bentoml.tensorflow.load_model(bento_model: str | Tag | bentoml.Model, device_name: str = '/device:CPU:0') → tf_ext.AutoTrackable | tf_ext.Module [source]#
Load a TensorFlow model from the BentoML local modelstore with the given name.
- Parameters:
bento_model – Either the tag of the model to get from the store, or a bentoml.Model instance to load the model from.
device_name – The device ID to load the model on. The device ID format should be compatible with tf.device.
- Returns:
An instance of the SavedModel format from the BentoML modelstore.
- Return type:
SavedModel
Examples:
```python
import bentoml

# load a model back into memory
model = bentoml.tensorflow.load_model("my_tensorflow_model")
```
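The device_name argument follows TensorFlow's standard device string naming, so loading onto a specific device is a matter of passing the matching string. A minimal sketch of the convention (the GPU entry assumes such a device exists; no store access is needed here):

```python
# tf.device-compatible device strings (standard TensorFlow naming):
default_device = "/device:CPU:0"   # load_model's default
first_gpu = "/device:GPU:0"        # hypothetical, requires an available GPU

# e.g. bentoml.tensorflow.load_model("my_tensorflow_model",
#                                    device_name=first_gpu)
print(default_device)
```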
- bentoml.tensorflow.get(tag_like: str | bentoml._internal.tag.Tag) → Model [source]#
Get the BentoML model with the given tag.
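Since save_model returns a tag in name:version form, get accepts the same kind of string. A minimal sketch of the tag convention (plain string handling; no model store access is needed):

```python
# The tag format accepted by bentoml.tensorflow.get is "name:version";
# "name:latest" resolves to the newest stored version of that model.
tag = "native_toy:latest"
name, version = tag.split(":")
print(name, version)
```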