
View source: R/sagemaker_operations.R

sagemaker_create_inference_component (R Documentation)

Creates an inference component, which is a SageMaker hosting object that you can use to deploy a model to an endpoint

Description

Creates an inference component, which is a SageMaker hosting object that you can use to deploy a model to an endpoint. In the inference component settings, you specify the model, the endpoint, and how the model uses the resources that the endpoint hosts. You can optimize resource utilization by tailoring how the required CPU cores, accelerators, and memory are allocated. You can deploy multiple inference components to an endpoint, where each inference component contains one model and the resource requirements for that individual model. After you deploy an inference component, you can invoke the associated model directly with the InvokeEndpoint API action.

See https://www.paws-r-sdk.com/docs/sagemaker_create_inference_component/ for full documentation.
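Once an inference component is deployed, the model behind it can be invoked through the SageMaker runtime client by passing the component's name. A minimal sketch, assuming the paws package is installed, a live endpoint exists, and placeholder names (`my-endpoint`, `my-inference-component`); recent versions of paws expose the `InferenceComponentName` parameter on `invoke_endpoint`:

```r
# Sketch only: requires AWS credentials and a deployed inference component.
library(paws)

rt <- sagemakerruntime()

resp <- rt$invoke_endpoint(
  EndpointName = "my-endpoint",                        # placeholder endpoint name
  InferenceComponentName = "my-inference-component",   # routes to this component's model
  ContentType = "application/json",
  Body = charToRaw('{"inputs": "hello"}')
)

# The response payload is raw bytes; decode as needed.
rawToChar(resp$Body)
```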

Usage

sagemaker_create_inference_component(
  InferenceComponentName,
  EndpointName,
  VariantName,
  Specification,
  RuntimeConfig,
  Tags = NULL
)

Arguments

InferenceComponentName

[required] A unique name to assign to the inference component.

EndpointName

[required] The name of an existing endpoint where you host the inference component.

VariantName

[required] The name of an existing production variant where you host the inference component.

Specification

[required] Details about the resources to deploy with this inference component, including the model, container, and compute resources.

RuntimeConfig

[required] Runtime settings for a model that is deployed with an inference component.

Tags

A list of key-value pairs associated with the model. For more information, see Tagging Amazon Web Services resources in the Amazon Web Services General Reference.
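The arguments above can be combined into a single call. A minimal sketch, assuming the paws package is installed and that an endpoint with a production variant already exists; all names, the resource figures, and the tag are placeholders, not prescribed values:

```r
# Sketch only: requires AWS credentials, an existing endpoint, and a
# SageMaker model ("my-model") already created in your account.
library(paws)

svc <- sagemaker()

svc$create_inference_component(
  InferenceComponentName = "my-inference-component",
  EndpointName = "my-endpoint",      # existing endpoint
  VariantName = "AllTraffic",        # existing production variant
  Specification = list(
    ModelName = "my-model",
    ComputeResourceRequirements = list(
      NumberOfCpuCoresRequired = 1,  # CPU cores reserved per copy
      MinMemoryRequiredInMb = 1024   # minimum memory per copy
    )
  ),
  RuntimeConfig = list(
    CopyCount = 1                    # number of model copies to run
  ),
  Tags = list(
    list(Key = "project", Value = "demo")
  )
)
```

The call returns the ARN of the new inference component; creation is asynchronous, so poll `describe_inference_component` until the component reaches the `InService` status before invoking it.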


paws.machine.learning documentation built on Sept. 12, 2024, 6:23 a.m.