ZVD: Zero Variance Discriminant Analysis

R Documentation

Zero Variance Discriminant Analysis

Description

Implements the ZVD algorithm for computing zero-variance discriminant vectors.

Usage

ZVD(A, ...)

## Default S3 method:
ZVD(A, scaling = FALSE, get_DVs = FALSE)

Arguments

A

Matrix whose first column corresponds to the class labels.

...

Parameters passed to ZVD.default.

scaling

Logical indicating whether to rescale the data so that each feature has variance 1.

get_DVs

Logical indicating whether to obtain the unpenalized zero-variance discriminant vectors.
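
A minimal sketch of how these arguments fit together (the synthetic data and variable names below are illustrative, not taken from the package documentation): the class labels occupy the first column of A, and the data should have more variables than samples so that the within-class covariance has a non-trivial null space.

  # Two classes, p = 20 variables, n = 20 samples in total, so the sample
  # within-class covariance is singular and zero-variance directions exist.
  set.seed(1)
  X <- rbind(matrix(rnorm(10 * 20), 10, 20),
             matrix(rnorm(10 * 20, mean = 1), 10, 20))
  y <- rep(1:2, each = 10)

  # Labels in the first column; both optional arguments switched on.
  fit <- accSDA::ZVD(cbind(y, X), scaling = TRUE, get_DVs = TRUE)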

Details

This function should potentially be made internal for the release.

Value

ZVD returns an object of class "ZVD": a list with the following named components:

dvs

discriminant vectors (optional).

B

sample between-class covariance.

W

sample within-class covariance.

N

basis for the null space of the sample within-class covariance.

mu

training mean and variance scaling/centering terms.

means

vectors of sample class-means.

k

number of classes in given data set.

labels

list of classes.

obs

matrix of data observations.

class_obs

matrices of observations of each class.
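
As a hedged, self-contained illustration (the toy data below are not from the package documentation), the components listed above can be inspected directly from the returned list:

  # Small high-dimensional toy problem so that zero-variance directions exist.
  set.seed(1)
  X <- rbind(matrix(rnorm(10 * 20), 10, 20),
             matrix(rnorm(10 * 20, mean = 1), 10, 20))
  fit <- accSDA::ZVD(cbind(rep(1:2, each = 10), X), get_DVs = TRUE)

  names(fit)    # the named components documented above
  fit$k         # number of classes (2 here)
  str(fit$dvs)  # unpenalized zero-variance discriminant vector(s)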


See Also

Used by: SZVDcv.

Examples

  # Generate Gaussian data for three classes with a bunch of redundant variables

  P <- 300 # Number of variables
  N <- 50 # Number of samples per class

  # Class means: zero everywhere except in the first 3 coordinates
  m1 <- rep(0,P)
  m1[1] <- 3

  m2 <- rep(0,P)
  m2[2] <- 3

  m3 <- rep(0,P)
  m3[3] <- 3

  # Sample dummy data
  Xtrain <- rbind(MASS::mvrnorm(n=N,mu = m1, Sigma = diag(P)),
              MASS::mvrnorm(n=N,mu = m2, Sigma = diag(P)),
              MASS::mvrnorm(n=N,mu = m3, Sigma = diag(P)))


  # Generate the labels
  Ytrain <- rep(1:3,each=N)

  # Normalize the data
  Xt <- accSDA::normalize(Xtrain)
  Xtrain <- Xt$Xc

  # Run the ZVD algorithm on the training data; the class labels must be
  # in the first column.
  res <- accSDA::ZVD(cbind(Ytrain,Xtrain))
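
  # A hedged follow-up, not part of the packaged example: refit with
  # get_DVs = TRUE to also obtain the unpenalized zero-variance discriminant
  # vectors, then check that they lie in the null space of the within-class
  # covariance (assuming dvs is returned as a p x (k - 1) matrix).
  res2 <- accSDA::ZVD(cbind(Ytrain, Xtrain), get_DVs = TRUE)
  res2$k                         # 3 classes
  dim(res2$dvs)                  # expected 300 x 2, one vector per column
  max(abs(res2$W %*% res2$dvs))  # should be numerically close to zero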
