KFOCI: Kernel Feature Ordering by Conditional Independence
In KPC: Kernel Partial Correlation Coefficient

Description

Variable selection with KPC using directed K-NN graph or minimum spanning tree (MST)

Usage

KFOCI(
  Y,
  X,
  k = kernlab::rbfdot(1/(2 * stats::median(stats::dist(Y))^2)),
  Knn = 1,
  num_features = NULL,
  stop = TRUE,
  numCores = 1,
  verbose = FALSE
)

Arguments

Y: a matrix of responses (n by dy)

X: a matrix of predictors (n by dx)

k: a function k(y, y') of class kernel. It can be a kernel implemented in kernlab, e.g. the Gaussian kernel rbfdot(sigma = 1) or the linear kernel vanilladot().

Knn: the number of nearest neighbors, or "MST" to use the minimum spanning tree

num_features: the number of variables to be selected; cannot be larger than dx. The default value is NULL, in which case it is set equal to dx. If stop == TRUE (see below), num_features is the maximal number of variables to be selected.

stop: if stop == TRUE, the automatic stopping criterion (stop at the first instance of negative Tn, as described in the paper) is applied, continuing until at most num_features variables have been selected. If stop == FALSE, exactly num_features variables are selected.

numCores: the number of cores used for parallelizing the process

verbose: whether to print each selected variable during the forward stepwise algorithm
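As a sketch of how the k argument is typically constructed (assuming the kernlab package is installed; the data matrix here is simulated for illustration):

```r
library(kernlab)

# Simulated response matrix (n = 50, dy = 2)
Y <- matrix(rnorm(100), ncol = 2)

# Gaussian (RBF) kernel with the median-heuristic bandwidth,
# mirroring the default shown in Usage above
sigma <- 1 / (2 * stats::median(stats::dist(Y))^2)
k_gauss <- rbfdot(sigma = sigma)

# Linear kernel
k_lin <- vanilladot()

# A kernel object can be evaluated on a pair of observations
k_gauss(Y[1, ], Y[2, ])
```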

Details

A stepwise forward selection of variables using KPC. At each step, the predictor X_j maximizing \hat{ρ}^2(Y, X_j | selected X_i's) is selected. It is suggested to normalize the predictors before applying KFOCI. Euclidean distance is used for computing the K-NN graph and the MST.
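A minimal end-to-end sketch of the forward selection (assuming the KPC package is installed; the data-generating model and variable names are illustrative):

```r
library(KPC)
set.seed(1)
n <- 200
X <- matrix(rnorm(n * 5), ncol = 5)  # 5 candidate predictors
X <- scale(X)                        # normalize predictors, as suggested above
# Y depends only on the first two predictors
Y <- matrix(X[, 1] + sin(X[, 2]) + rnorm(n, sd = 0.1), ncol = 1)

# Forward selection with the default Gaussian kernel and 1-NN graph;
# stops automatically at the first negative Tn
KFOCI(Y, X, Knn = 1, stop = TRUE)
```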

Value

The algorithm returns a vector of indices, from 1, ..., dx, of the selected variables.