Core Components#

bentoml.Service#

class bentoml.Service(name, *, runners=None, models=None)[source]#

The Service is the core building block in BentoML: it is where users define the service runtime architecture and the model serving logic, following a service-oriented architecture.

A BentoML service is defined by instantiating this Service class. When creating a Service instance, users must provide a service name and the list of runners required by the service. The instance can then be used to define InferenceAPIs via the api decorator.

add_asgi_middleware(middleware_cls, **options)[source]#
api(input, output, name=None, doc=None, route=None)[source]#

Decorator for adding an InferenceAPI to this service.

apis: t.Dict[str, InferenceAPI]#
mount_asgi_app(app, path='/', name=None)[source]#
mount_wsgi_app(app, path='/', name=None)[source]#
runners: t.List[Runner]#
bentoml.load(bento_identifier, working_dir=None, change_global_cwd=False)[source]#

Load a Service instance from the given bento_identifier.

A bento_identifier (str) can be provided in three different forms:

  • Tag pointing to a Bento in local Bento store under BENTOML_HOME/bentos

  • File path to a Bento directory

  • “import_str” for loading a service instance from the working_dir

Example load from Bento usage:

# load from local bento store
load("FraudDetector:latest")
load("FraudDetector:4tht2icroji6zput")

# load from bento directory
load("~/bentoml/bentos/iris_classifier/4tht2icroji6zput")

Example load from working directory by “import_str” usage:

# When multiple services are defined in the same module
load("fraud_detector:svc_a")
load("fraud_detector:svc_b")

# Find svc by Python module name or file path
load("fraud_detector:svc")
load("fraud_detector.py:svc")
load("foo.bar.fraud_detector:svc")
load("./def/abc/fraud_detector.py:svc")

# When there's only one Service instance in the target module, the attributes
# part in the svc_import_path can be omitted
load("fraud_detector.py")
load("fraud_detector")


bentoml.build#

bentoml.bentos.build(service, *, labels=None, description=None, include=None, exclude=None, docker=None, python=None, conda=None, version=None, build_ctx=None, _bento_store=<simple_di.providers.SingletonFactory object>, _model_store=<simple_di.providers.SingletonFactory object>)[source]#

User-facing API for building a Bento. The available build options are identical to the keys of a valid ‘bentofile.yaml’ file.

This API will not respect any ‘bentofile.yaml’ files. Build options should instead be provided via function call parameters.

Parameters
  • service (str) – import str for finding the bentoml.Service instance build target

  • labels (Optional[Dict[str, str]]) – optional immutable labels for carrying contextual info

  • description (Optional[str]) – optional description string in markdown format

  • include (Optional[List[str]]) – list of file paths and patterns specifying files to include in the Bento; by default all files under build_ctx are included, except those excluded via the exclude parameter or a .bentoignore file in a given directory

  • exclude (Optional[List[str]]) – list of file paths and patterns to exclude from the final Bento archive

  • docker (Optional[Dict[str, Any]]) – dictionary for configuring Bento’s containerization process, see details in bentoml._internal.bento.build_config.DockerOptions

  • python (Optional[Dict[str, Any]]) – dictionary for configuring Bento’s python dependencies, see details in bentoml._internal.bento.build_config.PythonOptions

  • conda (Optional[Dict[str, Any]]) – dictionary for configuring Bento’s conda dependencies, see details in bentoml._internal.bento.build_config.CondaOptions

  • version (Optional[str]) – Override the default auto generated version str

  • build_ctx (Optional[str]) – Build context directory; defaults to the current working directory

  • _bento_store (BentoStore) – save Bento created to this BentoStore

  • _model_store (ModelStore) – pull Models required from this ModelStore

Returns

a Bento instance representing the materialized Bento saved in BentoStore

Return type

Bento

Example

bentoml.bentos.build_bentofile(bentofile='bentofile.yaml', *, version=None, build_ctx=None, _bento_store=<simple_di.providers.SingletonFactory object>, _model_store=<simple_di.providers.SingletonFactory object>)[source]#

Build a Bento based on the options specified in a bentofile.yaml file.

By default, this function looks for a bentofile.yaml file in the current working directory.

Parameters
  • bentofile (str) – The file path to the build config YAML file

  • version (Optional[str]) – Override the default auto generated version str

  • build_ctx (Optional[str]) – Build context directory; defaults to the current working directory

  • _bento_store (BentoStore) – save Bento created to this BentoStore

  • _model_store (ModelStore) – pull Models required from this ModelStore
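For reference, a minimal bentofile.yaml that such a build might consume could look like the following; the service path, label values, and package names are illustrative assumptions:

```yaml
service: "service.py:svc"
labels:
  owner: ml-team
include:
  - "*.py"
python:
  packages:
    - scikit-learn
```

Calling build_bentofile() with no arguments would then pick this file up from the current working directory.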

bentoml.bentos.containerize(tag, docker_image_tag=None, *, add_host=None, allow=None, build_args=None, build_context=None, builder=None, cache_from=None, cache_to=None, cgroup_parent=None, iidfile=None, labels=None, load=True, metadata_file=None, network=None, no_cache=False, no_cache_filter=None, output=None, platform=None, progress='auto', pull=False, push=False, quiet=False, secrets=None, shm_size=None, rm=False, ssh=None, target=None, ulimit=None, _bento_store=<simple_di.providers.SingletonFactory object>)[source]#

bentoml.Bento#

class bentoml.Bento(tag, bento_fs, info)[source]#
property doc: str#
property info: BentoInfo#
property path: str#
path_of(item)[source]#
property tag: bentoml._internal.tag.Tag#

bentoml.Runner#

class bentoml.Runner(runnable_class, *, runnable_init_params=None, name=None, scheduling_strategy=<class 'bentoml._internal.runner.strategy.DefaultStrategy'>, models=None, cpu=None, nvidia_gpu=None, custom_resources=None, max_batch_size=None, max_latency_ms=None, method_configs=None)[source]#

bentoml.Runnable#

class bentoml.Runnable(**kwargs)[source]#

Tag#

class bentoml.Tag(name, version=None)[source]#

Model#

class bentoml.Model(tag, model_fs, info, custom_objects=None, *, _internal=False)[source]#
property info: bentoml._internal.models.model.ModelInfo#
property path: str#
path_of(item)[source]#
to_runnable()[source]#
to_runner(name='', cpu=None, nvidia_gpu=None, custom_resources=None, max_batch_size=None, max_latency_ms=None, method_configs=None)[source]#

Create a Runner instance from this Model.

Parameters
  • name (str) –

  • cpu (int | None) –

  • nvidia_gpu (int | None) –

  • custom_resources (dict[str, float] | None) –

  • max_batch_size (int | None) –

  • max_latency_ms (int | None) –

  • method_configs (dict | None) –

Returns

a Runner instance created from this Model

with_options(**kwargs)[source]#

YataiClient#

class bentoml.YataiClient[source]#