TensorFlowPredictor: TensorFlowPredictor Class


TensorFlowPredictor Class

Description

A “Predictor” implementation for inference against TensorFlow Serving endpoints.

Super classes

sagemaker.mlcore::PredictorBase -> sagemaker.mlcore::Predictor -> TensorFlowPredictor

Methods

Public methods


Method new()

Initialize a “TensorFlowPredictor”. See the “Predictor” class for more information about the parameters.

Usage
TensorFlowPredictor$new(
  endpoint_name,
  sagemaker_session = NULL,
  serializer = JSONSerializer$new(),
  deserializer = JSONDeserializer$new(),
  model_name = NULL,
  model_version = NULL,
  ...
)
Arguments
endpoint_name

(str): The name of the endpoint to perform inference on.

sagemaker_session

(sagemaker.session.Session): Session object which manages interactions with Amazon SageMaker APIs and any other AWS services needed. If not specified, one is created using the default AWS configuration chain.

serializer

(callable): Optional. The default serializes input data to JSON; it handles dicts, lists, and numpy arrays.

deserializer

(callable): Optional. The default parses the response from JSON.

model_name

(str): Optional. The name of the SavedModel that should handle the request. If not specified, the endpoint's default model handles the request.

model_version

(str): Optional. The version of the SavedModel that should handle the request. If not specified, the latest version of the model is used.

...

: Additional parameters passed to the Predictor constructor.

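A minimal construction sketch; the endpoint name is hypothetical and assumes a TensorFlow Serving endpoint has already been deployed in the active SageMaker session:

# Hypothetical endpoint name; assumes an existing TensorFlow Serving endpoint.
predictor <- TensorFlowPredictor$new(
  endpoint_name = "my-tf-serving-endpoint",
  model_name    = "my_saved_model",  # optional: route to a specific SavedModel
  model_version = "2"                # optional: pin a SavedModel version
)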

Method classify()

Send a classification request to the TensorFlow Serving endpoint.

Usage
TensorFlowPredictor$classify(data)
Arguments
data

(object): Input data for the classification request, in the layout expected by the model's classification signature.

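A hedged sketch of a classify call, assuming the hypothetical “predictor” above and a SavedModel that exposes a classification signature; the “examples” layout and feature names are illustrative and depend on the model:

# Feature names are illustrative; the payload must match the model's
# classification signature.
request <- list(
  examples = list(
    list(sepal_length = 5.1, sepal_width = 3.5)
  )
)
result <- predictor$classify(request)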

Method regress()

Send a regression request to the TensorFlow Serving endpoint.

Usage
TensorFlowPredictor$regress(data)
Arguments
data

(object): Input data for the regression request, in the layout expected by the model's regression signature.

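Similarly, a sketch of a regress call against a regression signature, again assuming the hypothetical “predictor” and illustrative feature names:

# Two illustrative examples, each with a single feature "x".
request <- list(examples = list(list(x = 1.0), list(x = 2.0)))
result <- predictor$regress(request)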

Method predict()

Return the inference from the specified endpoint.

Usage
TensorFlowPredictor$predict(data, initial_args = NULL)
Arguments
data

(object): Input data for which you want the model to provide inference. If a serializer was specified when creating the Predictor, the result of the serializer is sent as input data. Otherwise, the data must be a sequence of bytes, and the predict method sends the bytes in the request body as-is.

initial_args

(list[str,str]): Optional. Default arguments for boto3 “invoke_endpoint” call. Default is NULL (no default arguments).

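A minimal predict sketch, assuming the hypothetical “predictor” above; with the default JSON serializer, an R list in the TensorFlow Serving “instances” layout is one common payload shape:

# One instance with three features (illustrative values).
payload <- list(instances = list(c(1.0, 2.0, 3.0)))
response <- predictor$predict(payload)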

Method clone()

The objects of this class are cloneable with this method.

Usage
TensorFlowPredictor$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

