Types

class bentoml.types.InferenceTask(version: int = 0, data: Optional[Input] = None, error: Optional[bentoml.types.InferenceResult] = None, task_id: str = <factory>, is_discarded: bool = False, batch: Optional[int] = None, http_method: Optional[str] = None, http_headers: bentoml.types.HTTPHeaders = <HTTPHeaders()>, aws_lambda_event: Optional[dict] = None, cli_args: Optional[Sequence[str]] = None, inference_job_args: Optional[Mapping[str, Any]] = None)

The data structure passed to the BentoML API server for inference. It carries the payload data along with request context such as HTTP headers or CLI arguments.

class bentoml.types.InferenceResult(version: int = 0, data: Optional[Output] = None, err_msg: str = '', task_id: Optional[str] = None, http_status: int = 501, http_headers: bentoml.types.HTTPHeaders = <HTTPHeaders()>, aws_lambda_event: Optional[dict] = None, cli_status: Optional[int] = 0)

The data structure returned by the BentoML API server. It carries the result data along with response context such as HTTP headers.

class bentoml.types.InferenceError(version: int = 0, data: Optional[Output] = None, err_msg: str = '', task_id: Optional[str] = None, http_status: int = 500, http_headers: bentoml.types.HTTPHeaders = <HTTPHeaders()>, aws_lambda_event: Optional[dict] = None, cli_status: int = 1)

The default InferenceResult returned when an error occurs.