FactorizationMachines: A supervised learning algorithm used in classification and regression.

FactorizationMachines R Documentation

A supervised learning algorithm used in classification and regression.

Description

Factorization Machines combine the advantages of Support Vector Machines with factorization models. It is an extension of a linear model that is designed to capture interactions between features within high dimensional sparse datasets economically.
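
For context, the model a factorization machine fits is the standard second-order formulation from Rendle (2010); the equation below is supplied for reference and is not part of this package's documentation. For an input vector x with n features:

\hat{y}(x) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle v_i, v_j \rangle x_i x_j

Here w_0 is the bias term, the w_i are the linear terms, and each v_i is a k-dimensional factorization vector; k corresponds to the num_factors hyperparameter documented below.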

Super classes

sagemaker.mlcore::EstimatorBase -> sagemaker.mlcore::AmazonAlgorithmEstimatorBase -> FactorizationMachines

Public fields

repo_name

SageMaker repository name for the framework.

repo_version

Version of the framework.

.module

Mimics the Python module.

Active bindings

num_factors

Dimensionality of factorization.

predictor_type

Type of predictor 'binary_classifier' or 'regressor'.

epochs

Number of training epochs to run.

clip_gradient

Clip the gradient by projecting onto the box [-clip_gradient, +clip_gradient].

eps

Small value to avoid division by 0.

rescale_grad

If set, multiplies the gradient by rescale_grad before updating.

bias_lr

Non-negative learning rate for the bias term.

linear_lr

Non-negative learning rate for linear terms.

factors_lr

Non-negative learning rate for factorization terms.

bias_wd

Non-negative weight decay for the bias term.

linear_wd

Non-negative weight decay for linear terms.

factors_wd

Non-negative weight decay for factorization terms.

bias_init_method

Initialization method for the bias term: 'normal', 'uniform' or 'constant'.

bias_init_scale

Non-negative range for initialization of the bias term that takes effect when bias_init_method parameter is 'uniform'.

bias_init_sigma

Non-negative standard deviation for initialization of the bias term that takes effect when bias_init_method parameter is 'normal'.

bias_init_value

Initial value of the bias term that takes effect when bias_init_method parameter is 'constant'.

linear_init_method

Initialization method for linear terms: 'normal', 'uniform' or 'constant'.

linear_init_scale

Non-negative range for initialization of linear terms that takes effect when linear_init_method parameter is 'uniform'.

linear_init_sigma

Non-negative standard deviation for initialization of linear terms that takes effect when linear_init_method parameter is 'normal'.

linear_init_value

Initial value of linear terms that takes effect when linear_init_method parameter is 'constant'.

factors_init_method

Initialization method for factorization terms: 'normal', 'uniform' or 'constant'.

factors_init_scale

Non-negative range for initialization of factorization terms that takes effect when factors_init_method parameter is 'uniform'.

factors_init_sigma

Non-negative standard deviation for initialization of factorization terms that takes effect when factors_init_method parameter is 'normal'.

factors_init_value

Initial value of factorization terms that takes effect when factors_init_method parameter is 'constant'.

Methods

Public methods

Inherited methods

Method new()

Factorization Machines is an :class:'Estimator' for general-purpose supervised learning. Amazon SageMaker Factorization Machines is a general-purpose supervised learning algorithm that you can use for both classification and regression tasks. It is an extension of a linear model that is designed to parsimoniously capture interactions between features within high dimensional sparse datasets.

This Estimator may be fit via calls to :meth:'~sagemaker.amazon.amazon_estimator.AmazonAlgorithmEstimatorBase.fit'. It requires Amazon :class:'~sagemaker.amazon.record_pb2.Record' protobuf serialized data to be stored in S3. There is a utility :meth:'~sagemaker.amazon.amazon_estimator.AmazonAlgorithmEstimatorBase.record_set' that can be used to upload data to S3 and create a :class:'~sagemaker.amazon.amazon_estimator.RecordSet' to be passed to the 'fit' call. To learn more about the Amazon protobuf Record class and how to prepare bulk data in this format, please consult the AWS technical documentation: https://docs.aws.amazon.com/sagemaker/latest/dg/cdf-training.html

After this Estimator is fit, model data is stored in S3. The model may be deployed to an Amazon SageMaker Endpoint by invoking :meth:'~sagemaker.amazon.estimator.EstimatorBase.deploy'. As well as deploying an Endpoint, deploy returns a :class:'~sagemaker.amazon.pca.FactorizationMachinesPredictor' object that can be used for inference calls against the trained model hosted in the SageMaker Endpoint.

FactorizationMachines Estimators can be configured by setting hyperparameters; the available hyperparameters are documented below. For further information on the AWS FactorizationMachines algorithm, please consult the AWS technical documentation: https://docs.aws.amazon.com/sagemaker/latest/dg/fact-machines.html

A minimal usage sketch follows the argument descriptions below.

Usage
FactorizationMachines$new(
  role,
  instance_count,
  instance_type,
  num_factors,
  predictor_type,
  epochs = NULL,
  clip_gradient = NULL,
  eps = NULL,
  rescale_grad = NULL,
  bias_lr = NULL,
  linear_lr = NULL,
  factors_lr = NULL,
  bias_wd = NULL,
  linear_wd = NULL,
  factors_wd = NULL,
  bias_init_method = NULL,
  bias_init_scale = NULL,
  bias_init_sigma = NULL,
  bias_init_value = NULL,
  linear_init_method = NULL,
  linear_init_scale = NULL,
  linear_init_sigma = NULL,
  linear_init_value = NULL,
  factors_init_method = NULL,
  factors_init_scale = NULL,
  factors_init_sigma = NULL,
  factors_init_value = NULL,
  ...
)
Arguments
role

(str): An AWS IAM role (either name or full ARN). The Amazon SageMaker training jobs and APIs that create Amazon SageMaker endpoints use this role to access training data and model artifacts. After the endpoint is created, the inference code might use the IAM role if it accesses an AWS resource.

instance_count

(int): Number of Amazon EC2 instances to use for training.

instance_type

(str): Type of EC2 instance to use for training, for example, 'ml.c4.xlarge'.

num_factors

(int): Dimensionality of factorization.

predictor_type

(str): Type of predictor 'binary_classifier' or 'regressor'.

epochs

(int): Number of training epochs to run.

clip_gradient

(float): Optimizer parameter. Clip the gradient by projecting onto the box [-clip_gradient, +clip_gradient].

eps

(float): Optimizer parameter. Small value to avoid division by 0.

rescale_grad

(float): Optimizer parameter. If set, multiplies the gradient by rescale_grad before updating. Often chosen to be 1.0/batch_size.

bias_lr

(float): Non-negative learning rate for the bias term.

linear_lr

(float): Non-negative learning rate for linear terms.

factors_lr

(float): Non-negative learning rate for factorization terms.

bias_wd

(float): Non-negative weight decay for the bias term.

linear_wd

(float): Non-negative weight decay for linear terms.

factors_wd

(float): Non-negative weight decay for factorization terms.

bias_init_method

(string): Initialization method for the bias term: 'normal', 'uniform' or 'constant'.

bias_init_scale

(float): Non-negative range for initialization of the bias term that takes effect when bias_init_method parameter is 'uniform'.

bias_init_sigma

(float): Non-negative standard deviation for initialization of the bias term that takes effect when bias_init_method parameter is 'normal'.

bias_init_value

(float): Initial value of the bias term that takes effect when bias_init_method parameter is 'constant'.

linear_init_method

(string): Initialization method for linear terms: 'normal', 'uniform' or 'constant'.

linear_init_scale

(float): Non-negative range for initialization of linear terms that takes effect when linear_init_method parameter is 'uniform'.

linear_init_sigma

(float): Non-negative standard deviation for initialization of linear terms that takes effect when linear_init_method parameter is 'normal'.

linear_init_value

(float): Initial value of linear terms that takes effect when linear_init_method parameter is 'constant'.

factors_init_method

(string): Initialization method for factorization terms: 'normal', 'uniform' or 'constant'.

factors_init_scale

(float): Non-negative range for initialization of factorization terms that takes effect when factors_init_method parameter is 'uniform'.

factors_init_sigma

(float): Non-negative standard deviation for initialization of factorization terms that takes effect when factors_init_method parameter is 'normal'.

factors_init_value

(float): Initial value of factorization terms that takes effect when factors_init_method parameter is 'constant'.

...

: base class keyword argument values. You can find additional parameters for initializing this class at :class:'~sagemaker.estimator.amazon_estimator.AmazonAlgorithmEstimatorBase' and :class:'~sagemaker.estimator.EstimatorBase'.
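
A minimal end-to-end sketch of the workflow described above. The package name, the role ARN, the data objects ('train_x', 'train_y', 'test_x'), and the exact argument names of 'record_set()', 'fit()', 'deploy()' and 'predict()' are assumptions that mirror the Python SDK and may differ slightly in this package:

library(sagemaker.mlframework)

# Construct the estimator; num_factors and predictor_type are required.
fm <- FactorizationMachines$new(
  role = "arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role ARN
  instance_count = 1,
  instance_type = "ml.c4.xlarge",
  num_factors = 10,
  predictor_type = "binary_classifier",
  epochs = 25
)

# Upload training data to S3 as protobuf Records and describe it as a RecordSet.
records <- fm$record_set(train_x, labels = train_y)

# Train; model artifacts are written to S3 when the job completes.
fm$fit(records)

# Deploy to a SageMaker Endpoint; returns a predictor for inference calls.
predictor <- fm$deploy(initial_instance_count = 1, instance_type = "ml.m5.xlarge")

# Score new observations against the hosted endpoint.
predictions <- predictor$predict(test_x)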


Method create_model()

Return a :class:'~sagemaker.amazon.FactorizationMachinesModel' referencing the latest S3 model data produced by this Estimator. A brief usage sketch follows the arguments below.

Usage
FactorizationMachines$create_model(
  vpc_config_override = "VPC_CONFIG_DEFAULT",
  ...
)
Arguments
vpc_config_override

(dict[str, list[str]]): Optional override for VpcConfig set on the model. Default: use subnets and security groups from this Estimator.
* 'Subnets' (list[str]): List of subnet ids.
* 'SecurityGroupIds' (list[str]): List of security group ids.

...

: Additional kwargs passed to the FactorizationMachinesModel constructor.
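
A brief sketch of calling create_model() on a fitted estimator ('fm' from the sketch above); the subnet and security group ids are placeholders:

# Reuse the estimator's own subnets and security groups (the default).
model <- fm$create_model()

# Or override the VpcConfig explicitly.
model <- fm$create_model(
  vpc_config_override = list(
    Subnets = list("subnet-0abc1234"),      # placeholder subnet id
    SecurityGroupIds = list("sg-0def5678")  # placeholder security group id
  )
)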


Method clone()

The objects of this class are cloneable with this method.

Usage
FactorizationMachines$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

