RED: Redundancy Rate


View source: R/RED.R

Description

Calculate the redundancy rate of the selected features (markers). The value will be high if many redundant features are selected.

Usage

RED(x, spam_selected_feature_index, hsic_selected_feature_index,
    integrated_selected_feature_index)

Arguments

x

a matrix of markers or explanatory variables, each column contains one marker and each row represents an individual.

spam_selected_feature_index

index of selected markers from x using Sparse Additive Model.

hsic_selected_feature_index

index of selected markers from x using HSIC LASSO.

integrated_selected_feature_index

index of selected markers from x using the integrated model framework.

Details

The RED score (Zhao et al., 2010) is the average of the correlations between each pair of selected markers. A large RED score indicates that the selected features are strongly correlated with one another, i.e. that many redundant features have been selected. A small redundancy rate is therefore preferable for feature selection.
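The pairwise-correlation average described above can be sketched as follows. This is an illustrative implementation of the Zhao et al. (2010) definition, not the package's exact source; in particular, taking the absolute value of each correlation is an assumption here.

```r
# Sketch of the RED score: mean (absolute) Pearson correlation over all
# distinct pairs of selected marker columns. Hypothetical helper, not the
# package's RED() itself.
red_score <- function(x, selected_index) {
  m <- x[, selected_index, drop = FALSE]
  k <- ncol(m)
  if (k < 2) return(0)            # fewer than two markers: no pairs to average
  cor_mat <- abs(cor(m))          # pairwise absolute correlations
  # average over the k*(k-1)/2 distinct pairs (upper triangle, no diagonal)
  sum(cor_mat[upper.tri(cor_mat)]) / (k * (k - 1) / 2)
}

# toy usage: column 3 is nearly a copy of column 1, so their RED score is
# close to 1 (highly redundant selection)
set.seed(1)
x <- matrix(rnorm(100), nrow = 50)
x <- cbind(x, x[, 1] + rnorm(50, sd = 0.01))
red_score(x, c(1, 3))
```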

Value

Returns a list with the following components:

RED_spam

redundancy rate of the features selected by the Sparse Additive Model.

RED_hsic

redundancy rate of the features selected by HSIC LASSO.

RED_I

redundancy rate of the features selected by the integrated model framework.

Author(s)

Sayanti Guha Majumdar <sayanti23gm@gmail.com>, Anil Rai, Dwijesh Chandra Mishra

References

Guha Majumdar, S., Rai, A. and Mishra, D. C. (2019). Integrated framework for selection of additive and non-additive genetic markers for genomic selection. Journal of Computational Biology. doi:10.1089/cmb.2019.0223
Ravikumar, P., Lafferty, J., Liu, H. and Wasserman, L. (2009). Sparse additive models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 71(5), 1009-1030. doi:10.1111/j.1467-9868.2009.00718.x
Yamada, M., Jitkrittum, W., Sigal, L., Xing, E. P. and Sugiyama, M. (2014). High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso. Neural Computation, 26(1):185-207. doi:10.1162/NECO_a_00537
Zhao, Z., Wang, L. and Li, H. (2010). Efficient spectral feature selection with minimum redundancy. In AAAI Conference on Artificial Intelligence (AAAI), pp 673-678.

Examples

library(GSelection)
data(GS)
x_trn <- GS[1:40,1:110]
y_trn <- GS[1:40,111]
x_tst <- GS[41:60,1:110]
y_tst <- GS[41:60,111]
fit <- feature.selection(x_trn,y_trn,d=10)
red <- RED(x_trn, fit$spam_selected_feature_index, fit$hsic_selected_feature_index,
           fit$integrated_selected_feature_index)

GSelection documentation built on Nov. 4, 2019, 5:06 p.m.