HuggingFaceModel R Documentation
A Hugging Face SageMaker 'Model' that can be deployed to a SageMaker 'Endpoint'.
sagemaker.mlcore::ModelBase
-> sagemaker.mlcore::Model
-> sagemaker.mlcore::FrameworkModel
-> HuggingFaceModel
new()
Initialize a HuggingFaceModel.
HuggingFaceModel$new(
  role,
  model_data = NULL,
  entry_point = NULL,
  transformers_version = NULL,
  tensorflow_version = NULL,
  pytorch_version = NULL,
  py_version = NULL,
  image_uri = NULL,
  predictor_cls = HuggingFacePredictor,
  model_server_workers = NULL,
  ...
)
role
(str): An AWS IAM role specified with either the name or full ARN. The Amazon SageMaker training jobs and APIs that create Amazon SageMaker endpoints use this role to access training data and model artifacts. After the endpoint is created, the inference code might use the IAM role, if it needs to access an AWS resource.
model_data
(str): The Amazon S3 location of a SageMaker model data '.tar.gz' file.
entry_point
(str): The absolute or relative path to the Python source file that should be executed as the entry point to model hosting. If 'source_dir' is specified, then 'entry_point' must point to a file located at the root of 'source_dir'. Defaults to None.
transformers_version
(str): Transformers version you want to use for executing your inference code. Defaults to None. Required unless 'image_uri' is provided.
tensorflow_version
(str): TensorFlow version you want to use for executing your inference code. Defaults to 'None'. Required unless 'pytorch_version' is provided. List of supported versions: https://github.com/aws/sagemaker-python-sdk#huggingface-sagemaker-estimators.
pytorch_version
(str): PyTorch version you want to use for executing your inference code. Defaults to 'None'. Required unless 'tensorflow_version' is provided. List of supported versions: https://github.com/aws/sagemaker-python-sdk#huggingface-sagemaker-estimators.
py_version
(str): Python version you want to use for executing your inference code. Defaults to 'None'. Required unless 'image_uri' is provided.
image_uri
(str): A Docker image URI. Defaults to None. If not specified, a default image for PyTorch will be used. If 'transformers_version' or 'py_version' is 'None', then 'image_uri' is required. If it is also 'None', a 'ValueError' is raised.
predictor_cls
(callable[str, sagemaker.session.Session]): A function to call to create a predictor with an endpoint name and SageMaker 'Session'. If specified, 'deploy()' returns the result of invoking this function on the created endpoint name.
model_server_workers
(int): Optional. The number of worker processes used by the inference server. If None, the server will use one worker per vCPU.
...
: Keyword arguments passed to the superclass 'sagemaker.mlcore::FrameworkModel' and, subsequently, to its superclass 'sagemaker.mlcore::Model'.
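The following is a minimal sketch of constructing and deploying a model with this class. The package name, S3 path, role ARN, framework versions, and the 'deploy()' arguments ('initial_instance_count', 'instance_type') are illustrative assumptions, not values documented on this page.

library(sagemaker.mlframework)  # assumed package exposing HuggingFaceModel

hf_model <- HuggingFaceModel$new(
  role = "arn:aws:iam::123456789012:role/SageMakerRole",        # example role ARN
  model_data = "s3://my-bucket/huggingface-model/model.tar.gz",  # example model artifact
  transformers_version = "4.6",  # example versions; see the supported-version list above
  pytorch_version = "1.7",
  py_version = "py36"
)

# deploy() returns a HuggingFacePredictor (the default 'predictor_cls')
predictor <- hf_model$deploy(
  initial_instance_count = 1,      # assumed argument names, mirroring the Python SDK
  instance_type = "ml.m5.xlarge"
)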
register()
Creates a model package for creating SageMaker models or listing on Marketplace.
HuggingFaceModel$register(
  content_types,
  response_types,
  inference_instances,
  transform_instances,
  model_package_name = NULL,
  model_package_group_name = NULL,
  image_uri = NULL,
  model_metrics = NULL,
  metadata_properties = NULL,
  marketplace_cert = FALSE,
  approval_status = NULL,
  description = NULL,
  drift_check_baselines = NULL
)
content_types
(list): The supported MIME types for the input data.
response_types
(list): The supported MIME types for the output data.
inference_instances
(list): A list of the instance types that are used to generate inferences in real-time.
transform_instances
(list): A list of the instance types on which a transformation job can be run or on which an endpoint can be deployed.
model_package_name
(str): Model Package name; mutually exclusive with 'model_package_group_name'. Using 'model_package_name' makes the Model Package un-versioned. Defaults to 'None'.
model_package_group_name
(str): Model Package Group name; mutually exclusive with 'model_package_name'. Using 'model_package_group_name' makes the Model Package versioned. Defaults to 'None'.
image_uri
(str): Inference image URI for the container. If 'None', the Model's own image URI is used. Defaults to 'None'.
model_metrics
(ModelMetrics): ModelMetrics object. Defaults to 'None'.
metadata_properties
(MetadataProperties): MetadataProperties object. Defaults to 'None'.
marketplace_cert
(bool): A boolean value indicating if the Model Package is certified for AWS Marketplace. Defaults to 'FALSE'.
approval_status
(str): Model Approval Status, values can be "Approved", "Rejected", or "PendingManualApproval". Defaults to 'PendingManualApproval'.
description
(str): Model Package description. Defaults to 'None'.
drift_check_baselines
(DriftCheckBaselines): DriftCheckBaselines object. Defaults to 'None'.
A 'sagemaker.model.ModelPackage' instance.
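A hedged sketch of registering the model to a Model Package Group; the MIME types, instance types, and group name below are placeholder values, not values documented on this page.

model_package <- hf_model$register(
  content_types = list("application/json"),
  response_types = list("application/json"),
  inference_instances = list("ml.m5.xlarge"),
  transform_instances = list("ml.m5.xlarge"),
  model_package_group_name = "my-huggingface-models",  # creates a versioned Model Package
  approval_status = "PendingManualApproval"
)
# 'model_package' is the 'sagemaker.model.ModelPackage' instance described above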
prepare_container_def()
A container definition with framework configuration set in model environment variables.
HuggingFaceModel$prepare_container_def(instance_type = NULL, accelerator_type = NULL)
instance_type
(str): The EC2 instance type to deploy this Model to. For example, 'ml.p2.xlarge'.
accelerator_type
(str): The Elastic Inference accelerator type to deploy to the instance for loading and making inferences to the model.
list: A container definition object usable with the CreateModel API.
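As an illustration (the instance type is an assumed example, and 'hf_model' is the object built in the sketch above), the container definition can be inspected directly:

container_def <- hf_model$prepare_container_def(instance_type = "ml.g4dn.xlarge")
str(container_def)  # expected to include the image URI, model data location, and environment variables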
serving_image_uri()
Create a URI for the serving image.
HuggingFaceModel$serving_image_uri(region_name, instance_type, accelerator_type = NULL)
region_name
(str): AWS region where the image is uploaded.
instance_type
(str): SageMaker instance type. Used to determine device type (cpu/gpu/family-specific optimized).
accelerator_type
(str): The Elastic Inference accelerator type to deploy to the instance for loading and making inferences to the model.
str: The appropriate image URI based on the given parameters.
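A short sketch of resolving the serving image for a given region and instance type; the region and instance type are example values.

image_uri <- hf_model$serving_image_uri(
  region_name = "us-east-1",
  instance_type = "ml.g4dn.xlarge"
)
image_uri  # e.g. an ECR URI for the Hugging Face inference container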
clone()
The objects of this class are cloneable with this method.
HuggingFaceModel$clone(deep = FALSE)
deep
Whether to make a deep clone.
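For example, because R6 objects have reference semantics, an independent copy is obtained with a deep clone:

model_copy <- hf_model$clone(deep = TRUE)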