Bagging.lasso: A Bagging Prediction Model Using LASSO Selection Algorithm.


View source: R/Bagging.lasso.R

Description

This function performs a bagging prediction for linear and logistic regression model using the LASSO selection algorithm.

Usage

Bagging.lasso(x, y, family = c("gaussian", "binomial"), M = 100, subspace.size = 10,
             predictor.subset = round((9/10) * ncol(x)), boot.scale = 1, kfold = 10, 
             predictor.importance = TRUE, trimmed = FALSE, weighted = TRUE, 
             verbose = TRUE, seed = 123)

Arguments

x

input matrix. The dimension of the matrix is nobs x nvars; each row is a vector of observations of the variables.

y

response variable. For family="gaussian", y is a vector of quantitative responses. For family="binomial", y should be a factor with two levels, '0' and '1', where the level '1' is the target class.

family

response type (see above).

M

the number of base-level models (LASSO linear or logistic regression models) used to obtain the final prediction. Note that this also equals the number of bootstrap samples drawn. Defaults to 100.

subspace.size

the number of random subspaces to construct an ensemble prediction model. Defaults to 10.

predictor.subset

the number of randomly selected predictors taken from the training set to reduce the original p-dimensional feature space. Defaults to round((9/10)*ncol(x)), where ncol(x) is the dimension p of the input matrix x.

boot.scale

the scale of the sample size in each bootstrap re-sampling, relative to the original sample size. Defaults to 1.0, i.e., the same size as the original training sample.
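
For illustration, the bootstrap draw implied by boot.scale can be sketched as follows (a minimal sketch, not the package internals; x and y are the inputs passed to Bagging.lasso):

# Minimal sketch: a bootstrap sample scaled by boot.scale.
boot.scale <- 1                 # the default: same size as the training set
n <- nrow(x)
idx <- sample(seq_len(n), size = round(boot.scale * n), replace = TRUE)
x.boot <- x[idx, , drop = FALSE]
y.boot <- y[idx]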

kfold

the number of cross-validation folds; the default is 10. Although kfold can be as large as the sample size (leave-one-out CV), this is not recommended for large datasets. The smallest allowable value is kfold=3.

predictor.importance

logical. Should the importance of each predictor in the bagging LASSO model be evaluated? Defaults to TRUE. A permutation-based variable importance measure estimated by the out-of-bag error rate is adapted for the bagging model.
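
As an illustration, such a permutation-based measure can be sketched as below (a hedged sketch for the gaussian case, not the package's internal code; fit is assumed to be a cv.glmnet model trained on the bootstrap rows idx):

# Hedged sketch of permutation importance on the out-of-bag (OOB) rows.
oob <- setdiff(seq_len(nrow(x)), unique(idx))      # rows not drawn in the bootstrap
base.err <- mean((y[oob] - predict(fit, newx = as.matrix(x[oob, ])))^2)
importance <- sapply(colnames(x), function(v) {
  x.perm <- as.matrix(x[oob, ])
  x.perm[, v] <- sample(x.perm[, v])               # permute a single predictor
  mean((y[oob] - predict(fit, newx = x.perm))^2) - base.err
})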

trimmed

logical. Should a trimmed bagging strategy be performed? Defaults to FALSE. Traditional bagging draws bootstrap samples from the training sample, applies the base-level model to each bootstrap sample, and then averages over all obtained prediction rules. The idea of trimmed bagging is to exclude the bootstrapped prediction rules that yield the highest error rates and to aggregate over the remaining ones.
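
A hedged sketch of the trimming step follows (illustrative only; models is assumed to be a list of fitted base-level models, each carrying an out-of-bag error oob.error and test predictions pred, and the 25% trimming fraction is an assumption):

# Drop the worst-performing base models, then aggregate over the rest.
err  <- sapply(models, function(m) m$oob.error)
keep <- order(err)[seq_len(ceiling(0.75 * length(models)))]
pred.trimmed <- rowMeans(sapply(models[keep], function(m) m$pred))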

weighted

logical. Should a weighted rank aggregation procedure be performed? Defaults to TRUE. This procedure uses a Monte Carlo cross-entropy algorithm that combines the ranks of the base-level models under consideration via a weighted aggregation, optimizing a distance criterion to determine the best-performing base-level model.
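
The ranks and weights involved mirror the conv.scores component shown in the example output below. As a hedged sketch, a weighted rank aggregation of this kind can be run with the RankAggreg package (an assumption here, not a documented dependency), using toy orderings of four candidate models:

# Hedged sketch: Monte Carlo cross-entropy rank aggregation via RankAggreg.
library(RankAggreg)
ranks   <- rbind(c("3", "1", "4", "2"),      # model ordering under metric 1
                 c("1", "3", "2", "4"))      # model ordering under metric 2
weights <- rbind(c(0.98, 0.97, 0.96, 0.95),  # scores attached to each position
                 c(0.88, 0.86, 0.80, 0.78))
agg <- RankAggreg(ranks, k = 4, weights = weights,
                  method = "CE", distance = "Spearman")
agg$top.list                                 # aggregated best-to-worst ordering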

verbose

logical. Should the iterative process information of bagging model be presented? Defaults to TRUE.

seed

the seed for random sampling; the default value is 123.

Details

This bagging LASSO model Bagging.lasso generates an ensemble prediction based on L1-regularized linear or logistic regression models. The Bagging.lasso function uses a Monte Carlo cross-entropy algorithm to combine the ranks of the base-level LASSO regression models under consideration via a weighted aggregation that determines the best base-level model. Internally, the glmnet algorithm is used to fit the LASSO model paths for linear and logistic regression by coordinate descent. A random subspace method is employed to improve predictive performance. In addition, a trimmed bagging strategy can be enabled to exclude the bootstrapped prediction rules that yield the highest error rates and to aggregate over the remaining ones.
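
One base-level iteration of this scheme can be sketched as follows (a hedged illustration of the bootstrap, random-subspace and glmnet steps, not the package source; x, y and newx stand for the training matrix, the response and new data):

# Hedged sketch of a single base-level iteration of the ensemble.
library(glmnet)
n <- nrow(x); p <- ncol(x)
idx  <- sample(seq_len(n), size = n, replace = TRUE)    # bootstrap sample
vars <- sample(seq_len(p), size = round((9/10) * p))    # random predictor subspace
fit  <- cv.glmnet(as.matrix(x[idx, vars]), y[idx],
                  family = "gaussian", nfolds = 10)     # LASSO path + 10-fold CV
pred <- predict(fit, newx = as.matrix(newx[, vars]), s = "lambda.min")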

Value

family

the response type.

M

the number of base-level models used to obtain the bagging prediction.

predictor.subset

the number of randomly selected predictors used to reduce the original p-dimensional feature space.

subspace.size

the number of random subspaces to construct an ensemble prediction model.

validation.metric

the model validation measures.

boot.scale

the scale of sample size in each bootstrap re-sampling, relative to the original sample size.

distance

the distance function used in the weighted aggregation to define the similarity between each pair of base-level model rankings.

models.fitted

the base-level LASSO regression models fitted by the Bagging.lasso function.

models.trimmed

the trimmed base-level models fitted by the Bagging.lasso function if the trimmed bagging strategy is performed.

y.true

the true values of the response vector y.

conv.scores

the score matrix generated in the Monte Carlo cross-entropy algorithm according to the validation measures defined.

importance

the importance scores of variables identified by the Bagging.lasso model.

References

[1] Guo, P., Zeng, F., Hu, X., Zhang, D., Zhu, S., Deng, Y., & Hao, Y. (2015). Improved Variable Selection Algorithm Using a LASSO-Type Penalty, with an Application to Assessing Hepatitis B Infection Relevant Factors in Community Residents. PLoS ONE, 10(7): e0134151.

[2] Tibshirani, R. (1996). Regression Shrinkage and Selection via the Lasso. Journal of the Royal Statistical Society, Series B (Methodological), 58(1): 267-288.

[3] Breiman, L. (2001). Random Forests. Machine Learning, 45(1), 5-32.

Examples

# Example 1: Bagging LASSO linear regression model.
library(mlbench)
set.seed(0123)
mydata <- mlbench.threenorm(100, d=10)
x <- mydata$x
y <- mydata$classes
mydata <- as.data.frame(cbind(x, y))
colnames(mydata) <- c(paste("A", 1:10, sep=""), "y")
mydata$y <- ifelse(mydata$y==1, 0, 1)
# Split into training and testing data.
S1 <- as.vector(which(mydata$y==0))
S2 <- as.vector(which(mydata$y==1))
S3 <- sample(S1, ceiling(length(S1)*0.8), replace=FALSE)
S4 <- sample(S2, ceiling(length(S2)*0.8), replace=FALSE)
TrainInd <- c(S3, S4)
TestInd <- setdiff(1:length(mydata$y), TrainInd)
TrainXY <- mydata[TrainInd, ]
TestXY <- mydata[TestInd, ]
# Fit a bagging LASSO linear regression model. M is set to a small value
# in this example to reduce the running time; the default (M=100) is
# recommended in practice. Note that column 10 (the continuous A10) is
# the response here, so the binary y column remains among the predictors.
Bagging.fit <- Bagging.lasso(x=TrainXY[, -10], y=TrainXY[, 10],
family=c("gaussian"), M=2, predictor.subset=round((9/10)*ncol(x)),
predictor.importance=TRUE, trimmed=FALSE, weighted=TRUE, seed=0123)
# Print the 'bagging' object fitted by the Bagging.lasso function.
Print.bagging(Bagging.fit)
# Make predictions from a bagging LASSO linear regression model.
pred <- Predict.bagging(Bagging.fit, newx=TestXY[, -10], y=NULL, trimmed=FALSE)
pred
# Generate the plot of variable importance.
Plot.importance(Bagging.fit)
# Example 2: Bagging LASSO logistic regression model.
library(mlbench)
set.seed(0123)
mydata <- mlbench.threenorm(100, d=10)
x <- mydata$x
y <- mydata$classes
mydata <- as.data.frame(cbind(x, y))
colnames(mydata) <- c(paste("A", 1:10, sep=""), "y")
mydata$y <- ifelse(mydata$y==1, 0, 1)
# Split into training and testing data.
S1 <- as.vector(which(mydata$y==0))
S2 <- as.vector(which(mydata$y==1))
S3 <- sample(S1, ceiling(length(S1)*0.8), replace=FALSE)
S4 <- sample(S2, ceiling(length(S2)*0.8), replace=FALSE)
TrainInd <- c(S3, S4)
TestInd <- setdiff(1:length(mydata$y), TrainInd)
TrainXY <- mydata[TrainInd, ]
TestXY <- mydata[TestInd, ]
# Fit a bagging LASSO logistic regression model. M is set to a small value
# in this example to reduce the running time; the default (M=100) is
# recommended in practice.
Bagging.fit <- Bagging.lasso(x=TrainXY[, -11], y=TrainXY[, 11],
family=c("binomial"), M=2, predictor.subset=round((9/10)*ncol(x)),
predictor.importance=TRUE, trimmed=FALSE, weighted=TRUE, seed=0123)
# Print the 'bagging' object fitted by the Bagging.lasso function.
Print.bagging(Bagging.fit)
# Make predictions from a bagging LASSO logistic regression model.
pred <- Predict.bagging(Bagging.fit, newx=TestXY[, -11], y=NULL, trimmed=FALSE)
pred
# Generate the plot of variable importance.
Plot.importance(Bagging.fit)

Example output

Loading required package: glmnet
Loading required package: Matrix
Loading required package: foreach
Loaded glmnet 2.0-12

Iter  1 
Iter  2 
$family
[1] "gaussian"

$M
[1] 2

$predictor.subset
[1] 9

$subspace.size
[1] 10

$validation.metric
[1] "rmse"  "mae"   "re"    "smape"

$boot.scale
[1] 1

$distance
[1] "Spearman"

$models.fitted
$models.fitted[[1]]
$lambda
 [1] 0.334996951 0.305236745 0.278120354 0.253412908 0.230900404 0.210387848
 [7] 0.191697572 0.174667688 0.159150692 0.145012183 0.132129700 0.120391662
[13] 0.109696399 0.099951273 0.091071877 0.082981303 0.075609472 0.068892535
[19] 0.062772312 0.057195793 0.052114677 0.047484952 0.043266520 0.039422842
[25] 0.035920625 0.032729536 0.029821934 0.027172636 0.024758693 0.022559199
[31] 0.020555102 0.018729044 0.017065207 0.015549181 0.014167835 0.012909203
[37] 0.011762385 0.010717447 0.009765339 0.008897813 0.008107356 0.007387121
[43] 0.006730869 0.006132917 0.005588086 0.005091656 0.004639327 0.004227182
[49] 0.003851651 0.003509481 0.003197708 0.002913633 0.002654794 0.002418949
[55] 0.002204056 0.002008254 0.001829846 0.001667288 0.001519170 0.001384212
[61] 0.001261242 0.001149197 0.001047105

$cvm
 [1] 1.170612 1.171834 1.167885 1.158435 1.148482 1.139699 1.131608 1.123252
 [9] 1.115103 1.106609 1.097212 1.086507 1.077951 1.070438 1.063447 1.057730
[17] 1.053037 1.049194 1.046098 1.043674 1.042377 1.042595 1.043767 1.045995
[25] 1.048606 1.051590 1.054781 1.057938 1.061084 1.064116 1.066907 1.069368
[33] 1.071647 1.073573 1.075355 1.077017 1.078578 1.080044 1.081413 1.082683
[41] 1.083853 1.084938 1.085952 1.086871 1.087723 1.088511 1.089230 1.089902
[49] 1.090525 1.091127 1.091690 1.092116 1.092455 1.092752 1.093022 1.093276
[57] 1.093510 1.093738 1.093911 1.094093 1.094257 1.094407 1.094545

$cvsd
 [1] 0.1485115 0.1478064 0.1468317 0.1447677 0.1425382 0.1404797 0.1383893
 [8] 0.1357820 0.1337884 0.1324965 0.1314530 0.1307501 0.1306333 0.1302056
[15] 0.1293205 0.1287382 0.1283754 0.1281913 0.1281387 0.1281642 0.1280273
[22] 0.1277982 0.1276712 0.1275375 0.1274735 0.1274279 0.1274987 0.1276283
[29] 0.1277789 0.1279861 0.1281145 0.1280807 0.1280520 0.1280708 0.1280963
[36] 0.1281242 0.1281533 0.1281877 0.1282255 0.1282668 0.1283050 0.1283450
[43] 0.1283894 0.1284260 0.1284595 0.1284920 0.1285241 0.1285561 0.1285885
[50] 0.1286352 0.1286842 0.1287594 0.1288378 0.1289120 0.1289809 0.1290423
[57] 0.1290980 0.1291455 0.1291952 0.1292374 0.1292775 0.1293135 0.1293460

$cvup
 [1] 1.319123 1.319641 1.314717 1.303202 1.291021 1.280179 1.269998 1.259034
 [9] 1.248892 1.239105 1.228665 1.217257 1.208585 1.200644 1.192767 1.186468
[17] 1.181412 1.177385 1.174237 1.171838 1.170405 1.170394 1.171439 1.173532
[25] 1.176079 1.179018 1.182280 1.185566 1.188862 1.192102 1.195021 1.197449
[33] 1.199698 1.201644 1.203451 1.205141 1.206732 1.208232 1.209639 1.210950
[41] 1.212158 1.213283 1.214341 1.215297 1.216183 1.217003 1.217755 1.218458
[49] 1.219113 1.219762 1.220375 1.220876 1.221293 1.221664 1.222003 1.222318
[57] 1.222608 1.222883 1.223107 1.223330 1.223535 1.223721 1.223891

$cvlo
 [1] 1.0221003 1.0240278 1.0210534 1.0136670 1.0059441 0.9992192 0.9932191
 [8] 0.9874701 0.9813149 0.9741120 0.9657590 0.9557566 0.9473180 0.9402323
[15] 0.9341263 0.9289920 0.9246613 0.9210022 0.9179595 0.9155100 0.9143500
[22] 0.9147973 0.9160961 0.9184573 0.9211321 0.9241623 0.9272827 0.9303098
[29] 0.9333047 0.9361300 0.9387920 0.9412873 0.9435945 0.9455026 0.9472584
[36] 0.9488926 0.9504252 0.9518561 0.9531877 0.9544162 0.9555482 0.9565934
[43] 0.9575622 0.9584446 0.9592639 0.9600192 0.9607064 0.9613463 0.9619363
[50] 0.9624916 0.9630061 0.9633570 0.9636175 0.9638398 0.9640413 0.9642338
[57] 0.9644121 0.9645924 0.9647161 0.9648556 0.9649797 0.9650940 0.9651992

$nzero
 s0  s1  s2  s3  s4  s5  s6  s7  s8  s9 s10 s11 s12 s13 s14 s15 s16 s17 s18 s19 
  0   3   3   3   3   3   3   4   4   5   5   5   5   5   5   5   5   5   5   5 
s20 s21 s22 s23 s24 s25 s26 s27 s28 s29 s30 s31 s32 s33 s34 s35 s36 s37 s38 s39 
  5   6   6   6   6   6   7   7   7   7   7   7   7   7   7   7   8   8   8   8 
s40 s41 s42 s43 s44 s45 s46 s47 s48 s49 s50 s51 s52 s53 s54 s55 s56 s57 s58 s59 
  9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9 
s60 s61 s62 
  9   9   9 

$name
                 mse 
"Mean-Squared Error" 

$glmnet.fit

Call:  glmnet(x = as.matrix(training_1), y = trainY, family = "gaussian") 

      Df    %Dev    Lambda
 [1,]  0 0.00000 0.3350000
 [2,]  3 0.01892 0.3052000
 [3,]  3 0.05056 0.2781000
 [4,]  3 0.07684 0.2534000
 [5,]  3 0.09865 0.2309000
 [6,]  3 0.11680 0.2104000
 [7,]  3 0.13180 0.1917000
 [8,]  4 0.14530 0.1747000
 [9,]  4 0.15670 0.1592000
[10,]  5 0.17030 0.1450000
[11,]  5 0.18290 0.1321000
[12,]  5 0.19330 0.1204000
[13,]  5 0.20200 0.1097000
[14,]  5 0.20920 0.0999500
[15,]  5 0.21510 0.0910700
[16,]  5 0.22010 0.0829800
[17,]  5 0.22420 0.0756100
[18,]  5 0.22760 0.0688900
[19,]  5 0.23050 0.0627700
[20,]  5 0.23280 0.0572000
[21,]  5 0.23480 0.0521100
[22,]  6 0.23650 0.0474800
[23,]  6 0.23800 0.0432700
[24,]  6 0.23920 0.0394200
[25,]  6 0.24030 0.0359200
[26,]  6 0.24110 0.0327300
[27,]  7 0.24240 0.0298200
[28,]  7 0.24400 0.0271700
[29,]  7 0.24530 0.0247600
[30,]  7 0.24640 0.0225600
[31,]  7 0.24730 0.0205600
[32,]  7 0.24800 0.0187300
[33,]  7 0.24860 0.0170700
[34,]  7 0.24910 0.0155500
[35,]  7 0.24950 0.0141700
[36,]  7 0.24990 0.0129100
[37,]  8 0.25020 0.0117600
[38,]  8 0.25050 0.0107200
[39,]  8 0.25080 0.0097650
[40,]  8 0.25100 0.0088980
[41,]  9 0.25120 0.0081070
[42,]  9 0.25130 0.0073870
[43,]  9 0.25140 0.0067310
[44,]  9 0.25150 0.0061330
[45,]  9 0.25160 0.0055880
[46,]  9 0.25170 0.0050920
[47,]  9 0.25170 0.0046390
[48,]  9 0.25180 0.0042270
[49,]  9 0.25180 0.0038520
[50,]  9 0.25190 0.0035090
[51,]  9 0.25190 0.0031980
[52,]  9 0.25190 0.0029140
[53,]  9 0.25190 0.0026550
[54,]  9 0.25190 0.0024190
[55,]  9 0.25200 0.0022040
[56,]  9 0.25200 0.0020080
[57,]  9 0.25200 0.0018300
[58,]  9 0.25200 0.0016670
[59,]  9 0.25200 0.0015190
[60,]  9 0.25200 0.0013840
[61,]  9 0.25200 0.0012610
[62,]  9 0.25200 0.0011490
[63,]  9 0.25200 0.0010470
[64,]  9 0.25200 0.0009541
[65,]  9 0.25200 0.0008693

$lambda.min
[1] 0.05211468

$lambda.1se
[1] 0.2781204

attr(,"class")
[1] "cv.glmnet"

$models.fitted[[2]]
$lambda
 [1] 0.2576272052 0.2347403143 0.2138866318 0.1948855329 0.1775724393
 [6] 0.1617973932 0.1474237587 0.1343270383 0.1223937944 0.1115206671
[11] 0.1016134784 0.0925864170 0.0843612949 0.0768668700 0.0700382291
[16] 0.0638162259 0.0581469682 0.0529813518 0.0482746346 0.0439860491
[21] 0.0400784497 0.0365179907 0.0332738331 0.0303178774 0.0276245208
[26] 0.0251704345 0.0229343624 0.0208969369 0.0190405106 0.0173490041
[31] 0.0158077664 0.0144034480 0.0131238854 0.0119579956 0.0108956802
[36] 0.0099277380 0.0090457851 0.0082421825 0.0075099697 0.0068428047
[41] 0.0062349088 0.0056810167 0.0051763308 0.0047164799 0.0042974808
[46] 0.0039157044 0.0035678441 0.0032508866 0.0029620868 0.0026989432
[51] 0.0024591765 0.0022407100 0.0020416515 0.0018602768 0.0016950150
[56] 0.0015444345 0.0014072312 0.0012822167 0.0011683081 0.0010645188
[61] 0.0009699499 0.0008837822

$cvm
 [1] 1.0193890 1.0206737 1.0187454 1.0128485 1.0063380 1.0000077 0.9941295
 [8] 0.9879911 0.9816970 0.9762128 0.9721782 0.9699116 0.9688261 0.9686737
[15] 0.9697535 0.9727381 0.9767684 0.9809112 0.9854143 0.9900350 0.9947992
[22] 1.0004479 1.0064029 1.0126150 1.0186856 1.0244149 1.0297851 1.0345482
[29] 1.0388061 1.0428233 1.0463685 1.0495586 1.0524386 1.0550940 1.0575701
[36] 1.0598734 1.0619954 1.0639591 1.0657745 1.0674508 1.0689375 1.0702232
[43] 1.0714150 1.0725131 1.0735200 1.0744367 1.0752916 1.0760645 1.0767596
[50] 1.0774099 1.0780050 1.0785157 1.0789579 1.0793509 1.0797238 1.0800779
[57] 1.0803763 1.0806615 1.0809128 1.0811314 1.0813496 1.0815404

$cvsd
 [1] 0.1169343 0.1173638 0.1173544 0.1184963 0.1192615 0.1190444 0.1183406
 [8] 0.1173577 0.1162086 0.1151823 0.1142939 0.1134235 0.1127108 0.1120990
[15] 0.1115532 0.1109913 0.1104222 0.1100619 0.1099154 0.1099646 0.1102156
[22] 0.1110962 0.1120843 0.1131655 0.1142901 0.1154163 0.1165235 0.1174469
[29] 0.1182046 0.1188957 0.1195249 0.1201614 0.1207885 0.1213875 0.1219495
[36] 0.1224758 0.1229673 0.1234225 0.1238473 0.1242468 0.1246533 0.1250727
[43] 0.1254564 0.1258077 0.1261303 0.1264251 0.1266933 0.1269410 0.1271661
[50] 0.1273753 0.1275632 0.1277191 0.1278522 0.1279684 0.1280796 0.1281825
[57] 0.1282761 0.1283594 0.1284387 0.1285117 0.1285799 0.1286414

$cvup
 [1] 1.136323 1.138037 1.136100 1.131345 1.125599 1.119052 1.112470 1.105349
 [9] 1.097906 1.091395 1.086472 1.083335 1.081537 1.080773 1.081307 1.083729
[17] 1.087191 1.090973 1.095330 1.100000 1.105015 1.111544 1.118487 1.125781
[25] 1.132976 1.139831 1.146309 1.151995 1.157011 1.161719 1.165893 1.169720
[33] 1.173227 1.176481 1.179520 1.182349 1.184963 1.187382 1.189622 1.191698
[41] 1.193591 1.195296 1.196871 1.198321 1.199650 1.200862 1.201985 1.203006
[49] 1.203926 1.204785 1.205568 1.206235 1.206810 1.207319 1.207803 1.208260
[57] 1.208652 1.209021 1.209351 1.209643 1.209929 1.210182

$cvlo
 [1] 0.9024546 0.9033099 0.9013909 0.8943522 0.8870765 0.8809633 0.8757889
 [8] 0.8706334 0.8654883 0.8610305 0.8578843 0.8564881 0.8561153 0.8565747
[15] 0.8582003 0.8617468 0.8663461 0.8708493 0.8754989 0.8800704 0.8845836
[22] 0.8893517 0.8943186 0.8994495 0.9043955 0.9089986 0.9132616 0.9171012
[29] 0.9206015 0.9239276 0.9268436 0.9293973 0.9316500 0.9337065 0.9356206
[36] 0.9373976 0.9390280 0.9405366 0.9419272 0.9432040 0.9442842 0.9451505
[43] 0.9459586 0.9467054 0.9473897 0.9480116 0.9485984 0.9491234 0.9495936
[50] 0.9500346 0.9504418 0.9507966 0.9511057 0.9513826 0.9516442 0.9518954
[57] 0.9521002 0.9523021 0.9524741 0.9526196 0.9527697 0.9528990

$nzero
 s0  s1  s2  s3  s4  s5  s6  s7  s8  s9 s10 s11 s12 s13 s14 s15 s16 s17 s18 s19 
  0   2   2   3   4   4   4   4   4   5   5   5   5   5   6   6   6   6   6   6 
s20 s21 s22 s23 s24 s25 s26 s27 s28 s29 s30 s31 s32 s33 s34 s35 s36 s37 s38 s39 
  6   6   6   6   6   7   8   8   8   8   9   9   9   9   9   9   9   9   9   9 
s40 s41 s42 s43 s44 s45 s46 s47 s48 s49 s50 s51 s52 s53 s54 s55 s56 s57 s58 s59 
  9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9 
s60 s61 
  9   9 

$name
                 mse 
"Mean-Squared Error" 

$glmnet.fit

Call:  glmnet(x = as.matrix(training_1), y = trainY, family = "gaussian") 

      Df    %Dev    Lambda
 [1,]  0 0.00000 0.2576000
 [2,]  2 0.01339 0.2347000
 [3,]  2 0.03147 0.2139000
 [4,]  3 0.04921 0.1949000
 [5,]  4 0.06632 0.1776000
 [6,]  4 0.08253 0.1618000
 [7,]  4 0.09599 0.1474000
 [8,]  4 0.10720 0.1343000
 [9,]  4 0.11640 0.1224000
[10,]  5 0.12440 0.1115000
[11,]  5 0.13130 0.1016000
[12,]  5 0.13710 0.0925900
[13,]  5 0.14190 0.0843600
[14,]  5 0.14580 0.0768700
[15,]  6 0.15060 0.0700400
[16,]  6 0.15470 0.0638200
[17,]  6 0.15810 0.0581500
[18,]  6 0.16090 0.0529800
[19,]  6 0.16320 0.0482700
[20,]  6 0.16520 0.0439900
[21,]  6 0.16680 0.0400800
[22,]  6 0.16810 0.0365200
[23,]  6 0.16920 0.0332700
[24,]  6 0.17010 0.0303200
[25,]  6 0.17090 0.0276200
[26,]  7 0.17210 0.0251700
[27,]  8 0.17360 0.0229300
[28,]  8 0.17500 0.0209000
[29,]  8 0.17610 0.0190400
[30,]  8 0.17710 0.0173500
[31,]  9 0.17780 0.0158100
[32,]  9 0.17850 0.0144000
[33,]  9 0.17910 0.0131200
[34,]  9 0.17960 0.0119600
[35,]  9 0.18000 0.0109000
[36,]  9 0.18030 0.0099280
[37,]  9 0.18060 0.0090460
[38,]  9 0.18080 0.0082420
[39,]  9 0.18100 0.0075100
[40,]  9 0.18120 0.0068430
[41,]  9 0.18130 0.0062350
[42,]  9 0.18140 0.0056810
[43,]  9 0.18150 0.0051760
[44,]  9 0.18160 0.0047160
[45,]  9 0.18160 0.0042970
[46,]  9 0.18170 0.0039160
[47,]  9 0.18170 0.0035680
[48,]  9 0.18180 0.0032510
[49,]  9 0.18180 0.0029620
[50,]  9 0.18180 0.0026990
[51,]  9 0.18190 0.0024590
[52,]  9 0.18190 0.0022410
[53,]  9 0.18190 0.0020420
[54,]  9 0.18190 0.0018600
[55,]  9 0.18190 0.0016950
[56,]  9 0.18190 0.0015440
[57,]  9 0.18190 0.0014070
[58,]  9 0.18190 0.0012820
[59,]  9 0.18190 0.0011680
[60,]  9 0.18190 0.0010650
[61,]  9 0.18190 0.0009699
[62,]  9 0.18190 0.0008838
[63,]  9 0.18190 0.0008053
[64,]  9 0.18190 0.0007337
[65,]  9 0.18190 0.0006685

$lambda.min
[1] 0.07686687

$lambda.1se
[1] 0.2576272

attr(,"class")
[1] "cv.glmnet"


$models.trimmed
list()

$y.true
 [1] -0.83775479 -1.14851936  0.33924804 -1.43829554 -1.18967101 -0.41107042
 [7]  0.25222901  1.59426588 -2.06595813  1.02968473 -0.95343363  1.01458714
[13]  0.28231774 -1.68147706  0.57755498  0.63981125 -0.71844715 -0.78385149
[19]  1.12153585  1.41082054  2.31689124  0.05806684 -0.83275425 -0.19727404
[25]  0.75305102 -0.88536477  1.04273063  0.09621294  0.56114745 -0.01265842
[31]  1.08066531 -0.73433879 -1.15332488 -2.51413424 -1.83666523 -0.37910482
[37] -0.71644425  0.72703906  0.15528332  0.26264430 -0.25781196 -1.84640000
[43] -0.60635531 -1.48151667 -0.20135660 -2.24649499 -0.48603835 -0.37209404
[49]  0.08876528  1.03353537 -1.21352291  0.01505783 -1.94896593 -0.66214950
[55] -0.33336863 -0.22916520 -1.26203427 -1.32576006 -0.91152770 -0.88164621
[61]  0.99116568 -1.07941484 -2.71094480 -2.07449018 -0.23715966 -0.79074956
[67]  0.35391033  0.91415362 -0.51173620  0.73567712 -1.28623536  0.32291011
[73]  1.45426190 -0.82834478  0.37822243 -0.38892281 -1.84089825  0.11562563
[79]  0.38670154  0.35751608

$conv.scores
$conv.scores$ranks
     [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
[1,] "5"  "10" "8"  "7"  "4"  "3"  "2"  "1"  "9"  "6"  
[2,] "5"  "10" "8"  "7"  "4"  "3"  "2"  "1"  "9"  "6"  
[3,] "1"  "9"  "6"  "3"  "2"  "4"  "7"  "8"  "10" "5"  
[4,] "5"  "10" "4"  "7"  "8"  "2"  "3"  "9"  "6"  "1"  

$conv.scores$weights
          [,1]      [,2]      [,3]      [,4]      [,5]      [,6]      [,7]
[1,] 0.9770869 0.9770843 0.9747954 0.9747930 0.9697949 0.9658363 0.9653150
[2,] 1.2499899 1.2499870 1.2460446 1.2460409 1.2393659 1.2319513 1.2314571
[3,] 0.7616848 0.7609513 0.7375416 0.7373028 0.7370965 0.7358816 0.7355996
[4,] 0.7672880 0.7672870 0.7666914 0.7659642 0.7659635 0.7627853 0.7625705
          [,8]      [,9]     [,10]
[1,] 0.9618191 0.9606570 0.9594676
[2,] 1.2300032 1.2289072 1.2220636
[3,] 0.7355948 0.7346079 0.7346060
[4,] 0.7578440 0.7577510 0.7568872


$importance
         [,1]
A1 0.12302406
A2 0.00000000
A3 0.06926666
A4 0.06271640
A5 0.13859420
A6 0.06824081
A7 0.04522673
A8 0.07795560
A9 0.00000000
y  0.23927699

attr(,"class")
[1] "bagging"
$y.new
 [1]  0.04447873 -0.12871182 -0.23723027 -0.02300019 -0.03851816 -0.44884138
 [7] -0.80601068 -0.71891720  0.03472918  0.05418729 -0.06476798 -0.17292143
[13] -0.61339668 -0.26723470 -0.28321384 -0.67176706 -0.51268009 -0.61530304
[19] -0.51708935 -0.81701762

$y.se
 [1] 0.266523429 0.169630890 0.055204841 0.308323909 0.044965845 0.313516781
 [7] 0.343833731 0.129416122 0.036851445 0.153187885 0.219717119 0.004629453
[13] 0.015876657 0.015483059 0.138680946 0.079378230 0.062891881 0.051941293
[19] 0.325012999 0.293268186

$predicted.matrix
              [,1]         [,2]
 [1,]  0.311002160 -0.222044699
 [2,]  0.040919067 -0.298342714
 [3,] -0.292435110 -0.182025429
 [4,]  0.285323717 -0.331324101
 [5,] -0.083484002  0.006447688
 [6,] -0.762358161 -0.135324600
 [7,] -1.149844416 -0.462176953
 [8,] -0.848333322 -0.589501079
 [9,] -0.002122263  0.071580627
[10,] -0.099000596  0.207375173
[11,] -0.284485102  0.154949137
[12,] -0.168291977 -0.177550882
[13,] -0.597520023 -0.629273337
[14,] -0.282717763 -0.251751645
[15,] -0.421894781 -0.144532889
[16,] -0.592388826 -0.751145287
[17,] -0.575571975 -0.449788213
[18,] -0.667244329 -0.563361743
[19,] -0.842102350 -0.192076351
[20,] -1.110285808 -0.523749436

attr(,"class")
[1] "BaggingPrediction"
Iter  1 
Iter  2 
$family
[1] "binomial"

$M
[1] 2

$predictor.subset
[1] 9

$subspace.size
[1] 10

$validation.metric
[1] "accuracy"    "sensitivity" "specificity" "auc"         "kia"        

$boot.scale
[1] 1

$distance
[1] "Spearman"

$models.fitted
$models.fitted[[1]]
$lambda
 [1] 0.2317925169 0.2112007086 0.1924382197 0.1753425387 0.1597655908
 [6] 0.1455724560 0.1326402002 0.1208568104 0.1101202245 0.1003374472
[11] 0.0914237450 0.0833019114 0.0759015992 0.0691587103 0.0630148411
[16] 0.0574167763 0.0523160282 0.0476684164 0.0434336857 0.0395751567
[21] 0.0360594089 0.0328559904 0.0299371548 0.0272776204 0.0248543518
[26] 0.0226463597 0.0206345195 0.0188014057 0.0171311406 0.0156092573
[31] 0.0142225739 0.0129590796 0.0118078307 0.0107588556 0.0098030685
[36] 0.0089321909 0.0081386797 0.0074156618 0.0067568749 0.0061566127
[41] 0.0056096762 0.0051113280 0.0046572517 0.0042435143 0.0038665322
[46] 0.0035230402 0.0032100630 0.0029248898 0.0026650507 0.0024282949
[51] 0.0022125719 0.0020160131 0.0018369161 0.0016737295 0.0015250400
[56] 0.0013895596 0.0012661150 0.0011536368 0.0010511509 0.0009577696
[61] 0.0008726839 0.0007951571 0.0007245175 0.0006601534 0.0006015072

$cvm
 [1] 1.390173 1.367903 1.342192 1.320709 1.303998 1.286680 1.260045 1.227663
 [9] 1.197318 1.173190 1.154497 1.140684 1.131308 1.122471 1.114412 1.106902
[17] 1.100074 1.093191 1.088555 1.084766 1.082259 1.081368 1.080179 1.079510
[25] 1.079812 1.080260 1.081292 1.083419 1.086438 1.090247 1.094154 1.098015
[33] 1.102337 1.107088 1.112051 1.116835 1.121787 1.126853 1.131967 1.137072
[41] 1.142120 1.147071 1.151894 1.156561 1.161054 1.165352 1.169441 1.173328
[49] 1.176984 1.180454 1.183721 1.186783 1.189647 1.192315 1.194777 1.197087
[57] 1.199213 1.201198 1.203039 1.204717 1.206281 1.207707 1.209035 1.210263
[65] 1.211390

$cvsd
 [1] 0.01364905 0.01398154 0.01353940 0.01589334 0.01981455 0.02347824
 [7] 0.02760413 0.03108016 0.03420529 0.03784397 0.04209536 0.04684658
[13] 0.05266356 0.05914513 0.06560504 0.07180801 0.07834203 0.08502745
[19] 0.09142801 0.09762097 0.10356497 0.10925687 0.11463930 0.11984261
[25] 0.12470843 0.12893245 0.13303690 0.13708268 0.14105121 0.14497771
[31] 0.14880315 0.15230877 0.15572848 0.15906025 0.16227773 0.16532385
[37] 0.16825828 0.17107872 0.17378063 0.17636080 0.17881741 0.18114930
[43] 0.18335678 0.18544105 0.18740365 0.18924620 0.19097283 0.19258829
[49] 0.19409078 0.19549721 0.19680761 0.19802528 0.19915447 0.20019774
[55] 0.20115984 0.20204958 0.20286787 0.20362746 0.20432956 0.20497083
[61] 0.20556062 0.20609455 0.20659363 0.20705534 0.20747840

$cvup
 [1] 1.403822 1.381884 1.355731 1.336602 1.323812 1.310158 1.287649 1.258743
 [9] 1.231523 1.211034 1.196592 1.187531 1.183972 1.181616 1.180017 1.178710
[17] 1.178416 1.178219 1.179983 1.182387 1.185824 1.190625 1.194818 1.199353
[25] 1.204521 1.209193 1.214329 1.220501 1.227489 1.235225 1.242957 1.250324
[33] 1.258066 1.266148 1.274328 1.282159 1.290046 1.297932 1.305748 1.313433
[41] 1.320937 1.328220 1.335251 1.342002 1.348458 1.354598 1.360414 1.365917
[49] 1.371075 1.375952 1.380529 1.384808 1.388801 1.392513 1.395937 1.399137
[57] 1.402081 1.404826 1.407369 1.409688 1.411842 1.413802 1.415629 1.417318
[65] 1.418868

$cvlo
 [1] 1.3765243 1.3539213 1.3286522 1.3048155 1.2841830 1.2632013 1.2324406
 [8] 1.1965829 1.1631129 1.1353460 1.1124013 1.0938379 1.0786444 1.0633256
[15] 1.0488068 1.0350944 1.0217322 1.0081639 0.9971273 0.9871455 0.9786940
[22] 0.9721111 0.9655396 0.9596674 0.9551040 0.9513277 0.9482551 0.9463361
[29] 0.9453867 0.9452695 0.9453508 0.9457060 0.9466089 0.9480277 0.9497729
[36] 0.9515116 0.9535290 0.9557743 0.9581863 0.9607113 0.9633024 0.9659219
[43] 0.9685371 0.9711203 0.9736508 0.9761054 0.9784679 0.9807400 0.9828936
[50] 0.9849572 0.9869136 0.9887573 0.9904921 0.9921174 0.9936171 0.9950374
[57] 0.9963455 0.9975708 0.9987099 0.9997466 1.0007208 1.0016128 1.0024416
[64] 1.0032073 1.0039113

$nzero
 s0  s1  s2  s3  s4  s5  s6  s7  s8  s9 s10 s11 s12 s13 s14 s15 s16 s17 s18 s19 
  0   1   1   1   2   2   3   3   3   3   3   4   5   5   5   6   6   7   8   8 
s20 s21 s22 s23 s24 s25 s26 s27 s28 s29 s30 s31 s32 s33 s34 s35 s36 s37 s38 s39 
  8   8   8   8   8   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9 
s40 s41 s42 s43 s44 s45 s46 s47 s48 s49 s50 s51 s52 s53 s54 s55 s56 s57 s58 s59 
  9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9 
s60 s61 s62 s63 s64 
  9   9   9   9   9 

$name
           deviance 
"Binomial Deviance" 

$glmnet.fit

Call:  glmnet(x = as.matrix(training_1), y = as.factor(trainY), family = "binomial") 

      Df      %Dev    Lambda
 [1,]  0 8.023e-16 0.2318000
 [2,]  1 2.647e-02 0.2112000
 [3,]  1 4.860e-02 0.1924000
 [4,]  1 6.724e-02 0.1753000
 [5,]  2 8.603e-02 0.1598000
 [6,]  2 1.110e-01 0.1456000
 [7,]  3 1.357e-01 0.1326000
 [8,]  3 1.662e-01 0.1209000
 [9,]  3 1.923e-01 0.1101000
[10,]  3 2.148e-01 0.1003000
[11,]  3 2.343e-01 0.0914200
[12,]  4 2.520e-01 0.0833000
[13,]  5 2.698e-01 0.0759000
[14,]  5 2.874e-01 0.0691600
[15,]  5 3.027e-01 0.0630100
[16,]  6 3.181e-01 0.0574200
[17,]  6 3.321e-01 0.0523200
[18,]  7 3.460e-01 0.0476700
[19,]  8 3.594e-01 0.0434300
[20,]  8 3.716e-01 0.0395800
[21,]  8 3.824e-01 0.0360600
[22,]  8 3.920e-01 0.0328600
[23,]  8 4.004e-01 0.0299400
[24,]  8 4.078e-01 0.0272800
[25,]  8 4.143e-01 0.0248500
[26,]  9 4.210e-01 0.0226500
[27,]  9 4.270e-01 0.0206300
[28,]  9 4.323e-01 0.0188000
[29,]  9 4.369e-01 0.0171300
[30,]  9 4.409e-01 0.0156100
[31,]  9 4.444e-01 0.0142200
[32,]  9 4.475e-01 0.0129600
[33,]  9 4.501e-01 0.0118100
[34,]  9 4.524e-01 0.0107600
[35,]  9 4.544e-01 0.0098030
[36,]  9 4.561e-01 0.0089320
[37,]  9 4.576e-01 0.0081390
[38,]  9 4.588e-01 0.0074160
[39,]  9 4.599e-01 0.0067570
[40,]  9 4.608e-01 0.0061570
[41,]  9 4.616e-01 0.0056100
[42,]  9 4.622e-01 0.0051110
[43,]  9 4.628e-01 0.0046570
[44,]  9 4.633e-01 0.0042440
[45,]  9 4.637e-01 0.0038670
[46,]  9 4.640e-01 0.0035230
[47,]  9 4.643e-01 0.0032100
[48,]  9 4.645e-01 0.0029250
[49,]  9 4.647e-01 0.0026650
[50,]  9 4.649e-01 0.0024280
[51,]  9 4.651e-01 0.0022130
[52,]  9 4.652e-01 0.0020160
[53,]  9 4.653e-01 0.0018370
[54,]  9 4.654e-01 0.0016740
[55,]  9 4.654e-01 0.0015250
[56,]  9 4.655e-01 0.0013900
[57,]  9 4.655e-01 0.0012660
[58,]  9 4.656e-01 0.0011540
[59,]  9 4.656e-01 0.0010510
[60,]  9 4.656e-01 0.0009578
[61,]  9 4.657e-01 0.0008727
[62,]  9 4.657e-01 0.0007952
[63,]  9 4.657e-01 0.0007245
[64,]  9 4.657e-01 0.0006602
[65,]  9 4.657e-01 0.0006015
[66,]  9 4.657e-01 0.0005481

$lambda.min
[1] 0.02727762

$lambda.1se
[1] 0.1101202

attr(,"class")
[1] "cv.glmnet"

$models.fitted[[2]]
$lambda
 [1] 0.2245280966 0.2045816393 0.1864071703 0.1698472711 0.1547585076
 [6] 0.1410101884 0.1284832319 0.1170691357 0.1066690363 0.0971928531
[11] 0.0885585079 0.0806912141 0.0735228291 0.0669912636 0.0610399444
[16] 0.0556173239 0.0506764342 0.0461744794 0.0420724659 0.0383348640
[21] 0.0349293004 0.0318262776 0.0289989188 0.0264227348 0.0240754118
[26] 0.0219366186 0.0199878299 0.0182121662 0.0165942475 0.0151200603
[31] 0.0137768358 0.0125529397 0.0114377711 0.0104216711 0.0094958385
[36] 0.0086522544 0.0078836120 0.0071832536 0.0065451132 0.0059636633
[41] 0.0054338679 0.0049511381 0.0045112926 0.0041105218 0.0037453544
[46] 0.0034126274 0.0031094590 0.0028332232 0.0025815275 0.0023521917
[51] 0.0021432295 0.0019528309 0.0017793468 0.0016212745 0.0014772449
[56] 0.0013460106 0.0012264347 0.0011174816 0.0010182076 0.0009277529
[61] 0.0008453339 0.0007702367 0.0007018110

$cvm
 [1] 1.436664 1.423640 1.398594 1.377528 1.354445 1.328559 1.295393 1.261669
 [9] 1.231858 1.206731 1.185966 1.167854 1.152180 1.137593 1.125069 1.112630
[17] 1.101179 1.088093 1.075992 1.066618 1.057952 1.051075 1.046205 1.043135
[25] 1.041544 1.040995 1.041384 1.042551 1.044292 1.046487 1.049111 1.052060
[33] 1.055096 1.058048 1.061124 1.064315 1.067570 1.070847 1.074098 1.077316
[41] 1.080466 1.083330 1.085997 1.088573 1.091049 1.093416 1.095671 1.097687
[49] 1.099398 1.101037 1.102592 1.104056 1.105434 1.106698 1.107886 1.109012
[57] 1.110061 1.111036 1.111940 1.112778 1.113512 1.114225 1.114870

$cvsd
 [1] 0.02516253 0.02666794 0.02625097 0.03000431 0.03516634 0.04126805
 [7] 0.04801390 0.05488220 0.06169901 0.06846521 0.07524027 0.08207557
[13] 0.08905053 0.09601892 0.10323588 0.11050499 0.11712924 0.12123882
[19] 0.12540364 0.12970617 0.13410626 0.13848489 0.14277816 0.14706867
[25] 0.15137164 0.15567526 0.15992841 0.16416856 0.16838316 0.17255072
[31] 0.17663619 0.18062496 0.18450766 0.18834344 0.19206159 0.19563984
[37] 0.19907234 0.20235475 0.20547509 0.20844955 0.21126974 0.21386270
[43] 0.21626923 0.21853578 0.22066576 0.22266305 0.22453187 0.22618606
[49] 0.22757133 0.22888084 0.23010467 0.23124629 0.23230883 0.23327641
[55] 0.23418586 0.23503541 0.23582198 0.23654890 0.23721987 0.23783853
[61] 0.23838269 0.23890589 0.23938272

$cvup
 [1] 1.461826 1.450308 1.424845 1.407533 1.389612 1.369827 1.343407 1.316551
 [9] 1.293557 1.275196 1.261207 1.249930 1.241230 1.233612 1.228305 1.223135
[17] 1.218309 1.209332 1.201396 1.196324 1.192058 1.189560 1.188983 1.190204
[25] 1.192916 1.196671 1.201313 1.206720 1.212675 1.219038 1.225747 1.232685
[33] 1.239604 1.246392 1.253186 1.259955 1.266642 1.273201 1.279573 1.285766
[41] 1.291735 1.297192 1.302266 1.307109 1.311714 1.316079 1.320203 1.323873
[49] 1.326969 1.329918 1.332697 1.335303 1.337743 1.339975 1.342072 1.344047
[57] 1.345883 1.347585 1.349160 1.350616 1.351895 1.353130 1.354252

$cvlo
 [1] 1.4115010 1.3969717 1.3723426 1.3475241 1.3192791 1.2872912 1.2473788
 [8] 1.2067869 1.1701591 1.1382657 1.1107262 1.0857786 1.0631293 1.0415745
[15] 1.0218331 1.0021252 0.9840502 0.9668547 0.9505887 0.9369114 0.9238453
[22] 0.9125904 0.9034264 0.8960662 0.8901723 0.8853201 0.8814561 0.8783828
[29] 0.8759088 0.8739362 0.8724747 0.8714346 0.8705888 0.8697047 0.8690629
[36] 0.8686748 0.8684972 0.8684918 0.8686232 0.8688664 0.8691960 0.8694670
[43] 0.8697277 0.8700376 0.8703830 0.8707525 0.8711389 0.8715005 0.8718262
[50] 0.8721565 0.8724873 0.8728102 0.8731254 0.8734217 0.8737002 0.8739763
[57] 0.8742389 0.8744870 0.8747204 0.8749393 0.8751295 0.8753186 0.8754868

$nzero
 s0  s1  s2  s3  s4  s5  s6  s7  s8  s9 s10 s11 s12 s13 s14 s15 s16 s17 s18 s19 
  0   1   1   1   4   4   4   4   4   4   6   6   6   6   7   7   8   8   8   8 
s20 s21 s22 s23 s24 s25 s26 s27 s28 s29 s30 s31 s32 s33 s34 s35 s36 s37 s38 s39 
  8   8   8   8   8   8   8   9   9   9   9   9   9   9   9   9   9   9   9   9 
s40 s41 s42 s43 s44 s45 s46 s47 s48 s49 s50 s51 s52 s53 s54 s55 s56 s57 s58 s59 
  9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9   9 
s60 s61 s62 
  9   9   9 

$name
           deviance 
"Binomial Deviance" 

$glmnet.fit

Call:  glmnet(x = as.matrix(training_1), y = as.factor(trainY), family = "binomial") 

      Df      %Dev    Lambda
 [1,]  0 6.418e-16 0.2245000
 [2,]  1 2.484e-02 0.2046000
 [3,]  1 4.563e-02 0.1864000
 [4,]  1 6.312e-02 0.1698000
 [5,]  4 9.377e-02 0.1548000
 [6,]  4 1.284e-01 0.1410000
 [7,]  4 1.582e-01 0.1285000
 [8,]  4 1.840e-01 0.1171000
 [9,]  4 2.065e-01 0.1067000
[10,]  4 2.260e-01 0.0971900
[11,]  6 2.452e-01 0.0885600
[12,]  6 2.627e-01 0.0806900
[13,]  6 2.780e-01 0.0735200
[14,]  6 2.913e-01 0.0669900
[15,]  7 3.064e-01 0.0610400
[16,]  7 3.204e-01 0.0556200
[17,]  8 3.340e-01 0.0506800
[18,]  8 3.471e-01 0.0461700
[19,]  8 3.586e-01 0.0420700
[20,]  8 3.687e-01 0.0383300
[21,]  8 3.777e-01 0.0349300
[22,]  8 3.855e-01 0.0318300
[23,]  8 3.924e-01 0.0290000
[24,]  8 3.984e-01 0.0264200
[25,]  8 4.037e-01 0.0240800
[26,]  8 4.083e-01 0.0219400
[27,]  8 4.124e-01 0.0199900
[28,]  9 4.162e-01 0.0182100
[29,]  9 4.198e-01 0.0165900
[30,]  9 4.229e-01 0.0151200
[31,]  9 4.256e-01 0.0137800
[32,]  9 4.279e-01 0.0125500
[33,]  9 4.300e-01 0.0114400
[34,]  9 4.317e-01 0.0104200
[35,]  9 4.332e-01 0.0094960
[36,]  9 4.345e-01 0.0086520
[37,]  9 4.356e-01 0.0078840
[38,]  9 4.366e-01 0.0071830
[39,]  9 4.374e-01 0.0065450
[40,]  9 4.381e-01 0.0059640
[41,]  9 4.387e-01 0.0054340
[42,]  9 4.392e-01 0.0049510
[43,]  9 4.396e-01 0.0045110
[44,]  9 4.400e-01 0.0041110
[45,]  9 4.403e-01 0.0037450
[46,]  9 4.406e-01 0.0034130
[47,]  9 4.408e-01 0.0031090
[48,]  9 4.410e-01 0.0028330
[49,]  9 4.411e-01 0.0025820
[50,]  9 4.413e-01 0.0023520
[51,]  9 4.414e-01 0.0021430
[52,]  9 4.415e-01 0.0019530
[53,]  9 4.416e-01 0.0017790
[54,]  9 4.416e-01 0.0016210
[55,]  9 4.417e-01 0.0014770
[56,]  9 4.417e-01 0.0013460
[57,]  9 4.418e-01 0.0012260
[58,]  9 4.418e-01 0.0011170
[59,]  9 4.418e-01 0.0010180
[60,]  9 4.418e-01 0.0009278
[61,]  9 4.419e-01 0.0008453
[62,]  9 4.419e-01 0.0007702
[63,]  9 4.419e-01 0.0007018
[64,]  9 4.419e-01 0.0006395
[65,]  9 4.419e-01 0.0005827

$lambda.min
[1] 0.02193662

$lambda.1se
[1] 0.08855851

attr(,"class")
[1] "cv.glmnet"


$models.trimmed
list()

$y.true
 [1] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[39] 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
[77] 1 1 1 1
Levels: 0 1

$conv.scores
$conv.scores$ranks
     [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
[1,] "3"  "4"  "5"  "6"  "7"  "10" "1"  "2"  "9"  "8"  
[2,] "1"  "2"  "5"  "6"  "8"  "3"  "4"  "7"  "9"  "10" 
[3,] "3"  "4"  "7"  "10" "5"  "6"  "9"  "1"  "2"  "8"  
[4,] "1"  "2"  "7"  "3"  "5"  "6"  "9"  "4"  "10" "8"  
[5,] "5"  "6"  "3"  "4"  "7"  "10" "1"  "2"  "9"  "8"  

$conv.scores$weights
          [,1]      [,2]      [,3]      [,4]      [,5]      [,6]      [,7]
[1,] 0.7500000 0.7500000 0.7500000 0.7500000 0.7500000 0.7500000 0.7187500
[2,] 0.8666667 0.8666667 0.8666667 0.8666667 0.8666667 0.8000000 0.8000000
[3,] 0.7058824 0.7058824 0.7058824 0.7058824 0.6470588 0.6470588 0.6470588
[4,] 0.7882353 0.7705882 0.7686275 0.7588235 0.7588235 0.7588235 0.7549020
[5,] 0.5057915 0.5057915 0.5019455 0.5019455 0.5019455 0.5019455 0.4461538
          [,8]      [,9]     [,10]
[1,] 0.7187500 0.7187500 0.6875000
[2,] 0.8000000 0.8000000 0.8000000
[3,] 0.5882353 0.5882353 0.5294118
[4,] 0.7470588 0.7372549 0.7156863
[5,] 0.4461538 0.4418605 0.3869732


$importance
         [,1]
A1  0.2758553
A2  0.5464931
A3  0.8955025
A4  0.2191840
A5  0.2008697
A6  0.1540545
A7  0.5832975
A8  0.7176415
A9  0.1270706
A10 0.5371627

attr(,"class")
[1] "bagging"
$y.new
 [1] 0 0 0 1 0 0 1 0 0 0 0 1 1 1 0 1 0 0 0 1
Levels: 0 1

$probabilities
 [1] 0.0 0.0 0.0 1.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 1.0 1.0 1.0 0.0 1.0 0.0 0.5 0.0
[20] 1.0

$predicted.matrix
      [,1] [,2]
 [1,]    0    0
 [2,]    0    0
 [3,]    0    0
 [4,]    1    1
 [5,]    0    0
 [6,]    0    0
 [7,]    1    1
 [8,]    0    0
 [9,]    0    0
[10,]    0    0
[11,]    0    0
[12,]    1    1
[13,]    1    1
[14,]    1    1
[15,]    0    0
[16,]    1    1
[17,]    0    0
[18,]    0    1
[19,]    0    0
[20,]    1    1

attr(,"class")
[1] "BaggingPrediction"
