View source: R/Predict.bagging.R
Description

This function makes predictions for new data from a bagging LASSO linear or logistic regression model, using the stored 'bagging' object, with or without the use of a trimmed bagging strategy.
Usage

Predict.bagging(object, newx, y = NULL, trimmed = FALSE, scale.trimmed = 0.75)
Arguments

object: a fitted 'bagging' object.

newx: matrix of new values for x at which predictions are to be made. Must be a matrix. See documentation for Bagging.lasso.

y: response variable. Defaults to NULL. If the response variable for the newx matrix is known and supplied, the corresponding validation measures are calculated to evaluate prediction performance.

trimmed: logical. Should a trimmed bagging strategy be performed? Defaults to FALSE. This argument should match the setting used in the Bagging.lasso function (see the sketch after the Details section). See documentation for Bagging.lasso.

scale.trimmed: the proportion used to trim away the "worst" base-level models, in the sense of having the largest error rates, so that averaging is performed only over the most accurate base-level models. Defaults to 0.75.
Details

This function makes a prediction based on the object fitted by the Bagging.lasso model.
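For example, when a trimmed bagging strategy is used, the trimmed setting chosen at fitting time should be repeated at prediction time. The sketch below illustrates this; the data objects x.train, y.train, and x.new are hypothetical placeholders, and the remaining Bagging.lasso arguments are assumed to be left at their defaults.

## Sketch only: x.train, y.train, and x.new are hypothetical data objects.
fit.trimmed <- Bagging.lasso(x=x.train, y=y.train, family="gaussian",
                             trimmed=TRUE)
## Repeat the same trimmed setting at prediction time; scale.trimmed
## controls how the most accurate base-level models are retained for averaging.
pred.trimmed <- Predict.bagging(fit.trimmed, newx=as.matrix(x.new),
                                trimmed=TRUE, scale.trimmed=0.75)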
Value

y.new: the predicted values of the response vector y.

probabilities: the predicted probabilities of the response vector y.

predicted.matrix: the matrix of predicted values of the response vector y from the base-level LASSO regression models.

bagging.prediction: the performance of the bagging prediction according to the model validation measures defined.
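As a minimal sketch (assuming pred is the object returned by Predict.bagging), the components listed above can be extracted by name:

## 'pred' denotes a value returned by Predict.bagging.
pred$y.new               # bagged predictions for the new data
pred$predicted.matrix    # one column of predictions per base-level LASSO model
pred$probabilities       # predicted probabilities (relevant for logistic models)
pred$bagging.prediction  # validation measures, available when y was supplied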
References

[1] Breiman, L. (2001). Random Forests. Machine Learning, 45(1), 5-32.
[2] Croux, C., Joossens, K., & Lemmens, A. (2007). Trimmed bagging. Computational Statistics & Data Analysis, 52(1), 362-368.
Examples

library(mlbench)
set.seed(0123)
mydata <- mlbench.threenorm(100, d=10)
x <- mydata$x
y <- mydata$classes
mydata <- as.data.frame(cbind(x, y))
colnames(mydata) <- c(paste("A", 1:10, sep=""), "y")
mydata$y <- ifelse(mydata$y==1, 0, 1)
# Split into training and testing data.
S1 <- as.vector(which(mydata$y==0))
S2 <- as.vector(which(mydata$y==1))
S3 <- sample(S1, ceiling(length(S1)*0.8), replace=FALSE)
S4 <- sample(S2, ceiling(length(S2)*0.8), replace=FALSE)
TrainInd <- c(S3, S4)
TestInd <- setdiff(1:length(mydata$y), TrainInd)
TrainXY <- mydata[TrainInd, ]
TestXY <- mydata[TestInd, ]
# Fit a bagging LASSO linear regression model. The parameter M is set
# to a small value in this example to reduce the running time; in
# practice the default value is recommended.
Bagging.fit <- Bagging.lasso(x=TrainXY[, -10], y=TrainXY[, 10],
family=c("gaussian"), M=2, predictor.subset=round((9/10)*ncol(x)),
predictor.importance=TRUE, trimmed=FALSE, weighted=TRUE, seed=0123)
Bagging.fit
# Make predictions from a bagging LASSO linear regression model.
pred <- Predict.bagging(Bagging.fit, newx=TestXY[, -10], y=NULL)
pred
Example output
Loading required package: glmnet
Loading required package: Matrix
Loading required package: foreach
Loaded glmnet 2.0-16
Iter 1
Iter 2
$family
[1] "gaussian"
$M
[1] 2
$predictor.subset
[1] 9
$subspace.size
[1] 10
$validation.metric
[1] "rmse" "mae" "re" "smape"
$boot.scale
[1] 1
$distance
[1] "Spearman"
$models.fitted
$models.fitted[[1]]
$lambda
[1] 0.334996951 0.305236745 0.278120354 0.253412908 0.230900404 0.210387849
[7] 0.191697572 0.174667688 0.159150692 0.145012183 0.132129701 0.120391662
[13] 0.109696399 0.099951273 0.091071877 0.082981303 0.075609473 0.068892535
[19] 0.062772312 0.057195793 0.052114677 0.047484952 0.043266520 0.039422842
[25] 0.035920625 0.032729536 0.029821934 0.027172636 0.024758694 0.022559199
[31] 0.020555102 0.018729044 0.017065207 0.015549181 0.014167835 0.012909203
[37] 0.011762385 0.010717447 0.009765339 0.008897813 0.008107356 0.007387121
[43] 0.006730869 0.006132917 0.005588086 0.005091656 0.004639327 0.004227182
[49] 0.003851651 0.003509481 0.003197708 0.002913633 0.002654794 0.002418949
[55] 0.002204056 0.002008254 0.001829846 0.001667288 0.001519170 0.001384212
[61] 0.001261242 0.001149197 0.001047105
$cvm
[1] 1.170612 1.171834 1.167885 1.158435 1.148482 1.139699 1.131608 1.123252
[9] 1.115103 1.106609 1.097212 1.086507 1.077951 1.070438 1.063447 1.057730
[17] 1.053037 1.049194 1.046098 1.043674 1.042377 1.042595 1.043767 1.045995
[25] 1.048606 1.051590 1.054781 1.057938 1.061084 1.064116 1.066907 1.069368
[33] 1.071647 1.073573 1.075355 1.077017 1.078578 1.080044 1.081413 1.082683
[41] 1.083853 1.084938 1.085952 1.086871 1.087723 1.088511 1.089230 1.089902
[49] 1.090525 1.091127 1.091690 1.092116 1.092455 1.092752 1.093022 1.093276
[57] 1.093510 1.093738 1.093911 1.094093 1.094257 1.094407 1.094545
$cvsd
[1] 0.1485115 0.1478064 0.1468317 0.1447677 0.1425382 0.1404797 0.1383893
[8] 0.1357820 0.1337884 0.1324965 0.1314530 0.1307501 0.1306333 0.1302056
[15] 0.1293205 0.1287382 0.1283754 0.1281913 0.1281387 0.1281642 0.1280273
[22] 0.1277982 0.1276712 0.1275375 0.1274735 0.1274279 0.1274987 0.1276283
[29] 0.1277789 0.1279861 0.1281145 0.1280807 0.1280520 0.1280708 0.1280963
[36] 0.1281242 0.1281533 0.1281877 0.1282255 0.1282668 0.1283050 0.1283450
[43] 0.1283894 0.1284260 0.1284595 0.1284920 0.1285241 0.1285561 0.1285885
[50] 0.1286352 0.1286842 0.1287594 0.1288378 0.1289120 0.1289809 0.1290423
[57] 0.1290980 0.1291455 0.1291952 0.1292374 0.1292775 0.1293135 0.1293460
$cvup
[1] 1.319123 1.319641 1.314717 1.303202 1.291021 1.280179 1.269998 1.259034
[9] 1.248892 1.239105 1.228665 1.217257 1.208585 1.200644 1.192767 1.186468
[17] 1.181412 1.177385 1.174237 1.171838 1.170405 1.170394 1.171439 1.173532
[25] 1.176079 1.179018 1.182280 1.185566 1.188862 1.192102 1.195021 1.197449
[33] 1.199698 1.201644 1.203451 1.205141 1.206732 1.208232 1.209639 1.210950
[41] 1.212158 1.213283 1.214341 1.215297 1.216183 1.217003 1.217755 1.218458
[49] 1.219113 1.219762 1.220375 1.220876 1.221293 1.221664 1.222003 1.222318
[57] 1.222608 1.222883 1.223107 1.223330 1.223535 1.223721 1.223891
$cvlo
[1] 1.0221003 1.0240278 1.0210534 1.0136670 1.0059441 0.9992192 0.9932191
[8] 0.9874701 0.9813149 0.9741120 0.9657590 0.9557566 0.9473180 0.9402323
[15] 0.9341263 0.9289920 0.9246613 0.9210022 0.9179595 0.9155100 0.9143500
[22] 0.9147973 0.9160961 0.9184573 0.9211321 0.9241623 0.9272827 0.9303098
[29] 0.9333047 0.9361300 0.9387920 0.9412873 0.9435945 0.9455026 0.9472584
[36] 0.9488926 0.9504252 0.9518561 0.9531877 0.9544162 0.9555482 0.9565934
[43] 0.9575622 0.9584446 0.9592639 0.9600192 0.9607064 0.9613463 0.9619363
[50] 0.9624916 0.9630061 0.9633570 0.9636175 0.9638398 0.9640413 0.9642338
[57] 0.9644121 0.9645924 0.9647161 0.9648556 0.9649797 0.9650940 0.9651992
$nzero
s0 s1 s2 s3 s4 s5 s6 s7 s8 s9 s10 s11 s12 s13 s14 s15 s16 s17 s18 s19
0 3 3 3 3 3 3 4 4 5 5 5 5 5 5 5 5 5 5 5
s20 s21 s22 s23 s24 s25 s26 s27 s28 s29 s30 s31 s32 s33 s34 s35 s36 s37 s38 s39
5 6 6 6 6 6 7 7 7 7 7 7 7 7 7 7 8 8 8 8
s40 s41 s42 s43 s44 s45 s46 s47 s48 s49 s50 s51 s52 s53 s54 s55 s56 s57 s58 s59
9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9
s60 s61 s62
9 9 9
$name
mse
"Mean-Squared Error"
$glmnet.fit
Call: glmnet(x = as.matrix(training_1), y = trainY, family = "gaussian")
Df %Dev Lambda
[1,] 0 0.00000 0.3350000
[2,] 3 0.01892 0.3052000
[3,] 3 0.05056 0.2781000
[4,] 3 0.07684 0.2534000
[5,] 3 0.09865 0.2309000
[6,] 3 0.11680 0.2104000
[7,] 3 0.13180 0.1917000
[8,] 4 0.14530 0.1747000
[9,] 4 0.15670 0.1592000
[10,] 5 0.17030 0.1450000
[11,] 5 0.18290 0.1321000
[12,] 5 0.19330 0.1204000
[13,] 5 0.20200 0.1097000
[14,] 5 0.20920 0.0999500
[15,] 5 0.21510 0.0910700
[16,] 5 0.22010 0.0829800
[17,] 5 0.22420 0.0756100
[18,] 5 0.22760 0.0688900
[19,] 5 0.23050 0.0627700
[20,] 5 0.23280 0.0572000
[21,] 5 0.23480 0.0521100
[22,] 6 0.23650 0.0474800
[23,] 6 0.23800 0.0432700
[24,] 6 0.23920 0.0394200
[25,] 6 0.24030 0.0359200
[26,] 6 0.24110 0.0327300
[27,] 7 0.24240 0.0298200
[28,] 7 0.24400 0.0271700
[29,] 7 0.24530 0.0247600
[30,] 7 0.24640 0.0225600
[31,] 7 0.24730 0.0205600
[32,] 7 0.24800 0.0187300
[33,] 7 0.24860 0.0170700
[34,] 7 0.24910 0.0155500
[35,] 7 0.24950 0.0141700
[36,] 7 0.24990 0.0129100
[37,] 8 0.25020 0.0117600
[38,] 8 0.25050 0.0107200
[39,] 8 0.25080 0.0097650
[40,] 8 0.25100 0.0088980
[41,] 9 0.25120 0.0081070
[42,] 9 0.25130 0.0073870
[43,] 9 0.25140 0.0067310
[44,] 9 0.25150 0.0061330
[45,] 9 0.25160 0.0055880
[46,] 9 0.25170 0.0050920
[47,] 9 0.25170 0.0046390
[48,] 9 0.25180 0.0042270
[49,] 9 0.25180 0.0038520
[50,] 9 0.25190 0.0035090
[51,] 9 0.25190 0.0031980
[52,] 9 0.25190 0.0029140
[53,] 9 0.25190 0.0026550
[54,] 9 0.25190 0.0024190
[55,] 9 0.25200 0.0022040
[56,] 9 0.25200 0.0020080
[57,] 9 0.25200 0.0018300
[58,] 9 0.25200 0.0016670
[59,] 9 0.25200 0.0015190
[60,] 9 0.25200 0.0013840
[61,] 9 0.25200 0.0012610
[62,] 9 0.25200 0.0011490
[63,] 9 0.25200 0.0010470
[64,] 9 0.25200 0.0009541
[65,] 9 0.25200 0.0008693
$lambda.min
[1] 0.05211468
$lambda.1se
[1] 0.2781204
attr(,"class")
[1] "cv.glmnet"
$models.fitted[[2]]
$lambda
[1] 0.2576272052 0.2347403143 0.2138866318 0.1948855329 0.1775724393
[6] 0.1617973932 0.1474237588 0.1343270384 0.1223937945 0.1115206672
[11] 0.1016134785 0.0925864171 0.0843612950 0.0768668701 0.0700382292
[16] 0.0638162259 0.0581469683 0.0529813519 0.0482746346 0.0439860492
[21] 0.0400784498 0.0365179908 0.0332738331 0.0303178775 0.0276245208
[26] 0.0251704345 0.0229343625 0.0208969369 0.0190405107 0.0173490042
[31] 0.0158077665 0.0144034481 0.0131238854 0.0119579956 0.0108956802
[36] 0.0099277380 0.0090457852 0.0082421825 0.0075099698 0.0068428047
[41] 0.0062349088 0.0056810167 0.0051763308 0.0047164799 0.0042974808
[46] 0.0039157045 0.0035678441 0.0032508866 0.0029620868 0.0026989432
[51] 0.0024591765 0.0022407101 0.0020416516 0.0018602769 0.0016950150
[56] 0.0015444345 0.0014072312 0.0012822167 0.0011683081 0.0010645188
[61] 0.0009699499 0.0008837822
$cvm
[1] 1.0193890 1.0206737 1.0187454 1.0128485 1.0063380 1.0000077 0.9941295
[8] 0.9879911 0.9816970 0.9762128 0.9721782 0.9699116 0.9688261 0.9686737
[15] 0.9697535 0.9727381 0.9767684 0.9809112 0.9854143 0.9900350 0.9947992
[22] 1.0004479 1.0064029 1.0126150 1.0186856 1.0244149 1.0297851 1.0345482
[29] 1.0388061 1.0428233 1.0463685 1.0495586 1.0524386 1.0550940 1.0575701
[36] 1.0598734 1.0619954 1.0639591 1.0657745 1.0674508 1.0689375 1.0702232
[43] 1.0714150 1.0725131 1.0735200 1.0744367 1.0752916 1.0760645 1.0767596
[50] 1.0774099 1.0780050 1.0785157 1.0789579 1.0793509 1.0797238 1.0800779
[57] 1.0803763 1.0806615 1.0809128 1.0811314 1.0813496 1.0815404
$cvsd
[1] 0.1169343 0.1173638 0.1173544 0.1184963 0.1192615 0.1190444 0.1183406
[8] 0.1173577 0.1162086 0.1151823 0.1142939 0.1134235 0.1127108 0.1120990
[15] 0.1115532 0.1109913 0.1104222 0.1100619 0.1099154 0.1099646 0.1102156
[22] 0.1110962 0.1120843 0.1131655 0.1142901 0.1154163 0.1165235 0.1174469
[29] 0.1182046 0.1188957 0.1195249 0.1201614 0.1207885 0.1213875 0.1219495
[36] 0.1224758 0.1229673 0.1234225 0.1238473 0.1242468 0.1246533 0.1250727
[43] 0.1254564 0.1258077 0.1261303 0.1264251 0.1266933 0.1269410 0.1271661
[50] 0.1273753 0.1275632 0.1277191 0.1278522 0.1279684 0.1280796 0.1281825
[57] 0.1282761 0.1283594 0.1284387 0.1285117 0.1285799 0.1286414
$cvup
[1] 1.136323 1.138037 1.136100 1.131345 1.125599 1.119052 1.112470 1.105349
[9] 1.097906 1.091395 1.086472 1.083335 1.081537 1.080773 1.081307 1.083729
[17] 1.087191 1.090973 1.095330 1.100000 1.105015 1.111544 1.118487 1.125781
[25] 1.132976 1.139831 1.146309 1.151995 1.157011 1.161719 1.165893 1.169720
[33] 1.173227 1.176481 1.179520 1.182349 1.184963 1.187382 1.189622 1.191698
[41] 1.193591 1.195296 1.196871 1.198321 1.199650 1.200862 1.201985 1.203006
[49] 1.203926 1.204785 1.205568 1.206235 1.206810 1.207319 1.207803 1.208260
[57] 1.208652 1.209021 1.209351 1.209643 1.209929 1.210182
$cvlo
[1] 0.9024546 0.9033099 0.9013909 0.8943522 0.8870765 0.8809633 0.8757889
[8] 0.8706334 0.8654883 0.8610305 0.8578843 0.8564881 0.8561153 0.8565747
[15] 0.8582003 0.8617468 0.8663461 0.8708493 0.8754989 0.8800704 0.8845836
[22] 0.8893517 0.8943186 0.8994495 0.9043955 0.9089986 0.9132616 0.9171012
[29] 0.9206015 0.9239276 0.9268436 0.9293973 0.9316500 0.9337065 0.9356206
[36] 0.9373976 0.9390280 0.9405366 0.9419272 0.9432040 0.9442842 0.9451505
[43] 0.9459586 0.9467054 0.9473897 0.9480116 0.9485984 0.9491234 0.9495936
[50] 0.9500346 0.9504418 0.9507966 0.9511057 0.9513826 0.9516442 0.9518954
[57] 0.9521002 0.9523021 0.9524741 0.9526196 0.9527697 0.9528990
$nzero
s0 s1 s2 s3 s4 s5 s6 s7 s8 s9 s10 s11 s12 s13 s14 s15 s16 s17 s18 s19
0 2 2 3 4 4 4 4 4 5 5 5 5 5 6 6 6 6 6 6
s20 s21 s22 s23 s24 s25 s26 s27 s28 s29 s30 s31 s32 s33 s34 s35 s36 s37 s38 s39
6 6 6 6 6 7 8 8 8 8 9 9 9 9 9 9 9 9 9 9
s40 s41 s42 s43 s44 s45 s46 s47 s48 s49 s50 s51 s52 s53 s54 s55 s56 s57 s58 s59
9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9 9
s60 s61
9 9
$name
mse
"Mean-Squared Error"
$glmnet.fit
Call: glmnet(x = as.matrix(training_1), y = trainY, family = "gaussian")
Df %Dev Lambda
[1,] 0 0.00000 0.2576000
[2,] 2 0.01339 0.2347000
[3,] 2 0.03147 0.2139000
[4,] 3 0.04921 0.1949000
[5,] 4 0.06632 0.1776000
[6,] 4 0.08253 0.1618000
[7,] 4 0.09599 0.1474000
[8,] 4 0.10720 0.1343000
[9,] 4 0.11640 0.1224000
[10,] 5 0.12440 0.1115000
[11,] 5 0.13130 0.1016000
[12,] 5 0.13710 0.0925900
[13,] 5 0.14190 0.0843600
[14,] 5 0.14580 0.0768700
[15,] 6 0.15060 0.0700400
[16,] 6 0.15470 0.0638200
[17,] 6 0.15810 0.0581500
[18,] 6 0.16090 0.0529800
[19,] 6 0.16320 0.0482700
[20,] 6 0.16520 0.0439900
[21,] 6 0.16680 0.0400800
[22,] 6 0.16810 0.0365200
[23,] 6 0.16920 0.0332700
[24,] 6 0.17010 0.0303200
[25,] 6 0.17090 0.0276200
[26,] 7 0.17210 0.0251700
[27,] 8 0.17360 0.0229300
[28,] 8 0.17500 0.0209000
[29,] 8 0.17610 0.0190400
[30,] 8 0.17710 0.0173500
[31,] 9 0.17780 0.0158100
[32,] 9 0.17850 0.0144000
[33,] 9 0.17910 0.0131200
[34,] 9 0.17960 0.0119600
[35,] 9 0.18000 0.0109000
[36,] 9 0.18030 0.0099280
[37,] 9 0.18060 0.0090460
[38,] 9 0.18080 0.0082420
[39,] 9 0.18100 0.0075100
[40,] 9 0.18120 0.0068430
[41,] 9 0.18130 0.0062350
[42,] 9 0.18140 0.0056810
[43,] 9 0.18150 0.0051760
[44,] 9 0.18160 0.0047160
[45,] 9 0.18160 0.0042970
[46,] 9 0.18170 0.0039160
[47,] 9 0.18170 0.0035680
[48,] 9 0.18180 0.0032510
[49,] 9 0.18180 0.0029620
[50,] 9 0.18180 0.0026990
[51,] 9 0.18190 0.0024590
[52,] 9 0.18190 0.0022410
[53,] 9 0.18190 0.0020420
[54,] 9 0.18190 0.0018600
[55,] 9 0.18190 0.0016950
[56,] 9 0.18190 0.0015440
[57,] 9 0.18190 0.0014070
[58,] 9 0.18190 0.0012820
[59,] 9 0.18190 0.0011680
[60,] 9 0.18190 0.0010650
[61,] 9 0.18190 0.0009699
[62,] 9 0.18190 0.0008838
[63,] 9 0.18190 0.0008053
[64,] 9 0.18190 0.0007337
[65,] 9 0.18190 0.0006685
$lambda.min
[1] 0.07686687
$lambda.1se
[1] 0.2576272
attr(,"class")
[1] "cv.glmnet"
$models.trimmed
list()
$y.true
[1] -0.83775479 -1.14851936 0.33924804 -1.43829554 -1.18967101 -0.41107042
[7] 0.25222901 1.59426588 -2.06595813 1.02968473 -0.95343363 1.01458714
[13] 0.28231774 -1.68147706 0.57755498 0.63981125 -0.71844715 -0.78385149
[19] 1.12153585 1.41082054 2.31689124 0.05806684 -0.83275425 -0.19727404
[25] 0.75305102 -0.88536477 1.04273063 0.09621294 0.56114745 -0.01265842
[31] 1.08066531 -0.73433879 -1.15332488 -2.51413424 -1.83666523 -0.37910482
[37] -0.71644425 0.72703906 0.15528332 0.26264430 -0.25781196 -1.84640000
[43] -0.60635531 -1.48151667 -0.20135660 -2.24649499 -0.48603835 -0.37209404
[49] 0.08876528 1.03353537 -1.21352291 0.01505783 -1.94896593 -0.66214950
[55] -0.33336863 -0.22916520 -1.26203427 -1.32576006 -0.91152770 -0.88164621
[61] 0.99116568 -1.07941484 -2.71094480 -2.07449018 -0.23715966 -0.79074956
[67] 0.35391033 0.91415362 -0.51173620 0.73567712 -1.28623536 0.32291011
[73] 1.45426190 -0.82834478 0.37822243 -0.38892281 -1.84089825 0.11562563
[79] 0.38670154 0.35751608
$conv.scores
$conv.scores$ranks
[,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
[1,] "5" "10" "8" "7" "4" "3" "2" "1" "9" "6"
[2,] "5" "10" "8" "7" "4" "3" "2" "1" "9" "6"
[3,] "1" "9" "6" "3" "2" "4" "7" "8" "10" "5"
[4,] "5" "10" "4" "7" "8" "2" "3" "9" "6" "1"
$conv.scores$weights
[,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,] 0.9770869 0.9770843 0.9747954 0.9747930 0.9697949 0.9658363 0.9653150
[2,] 1.2499899 1.2499870 1.2460446 1.2460409 1.2393659 1.2319513 1.2314571
[3,] 0.7616848 0.7609513 0.7375416 0.7373028 0.7370965 0.7358816 0.7355996
[4,] 0.7672880 0.7672870 0.7666914 0.7659642 0.7659635 0.7627853 0.7625705
[,8] [,9] [,10]
[1,] 0.9618191 0.9606570 0.9594676
[2,] 1.2300032 1.2289072 1.2220636
[3,] 0.7355948 0.7346079 0.7346060
[4,] 0.7578440 0.7577510 0.7568872
$importance
[,1]
A1 0.12302406
A2 0.00000000
A3 0.06926666
A4 0.06271640
A5 0.13859420
A6 0.06824081
A7 0.04522673
A8 0.07795560
A9 0.00000000
y 0.23927699
attr(,"class")
[1] "bagging"
$y.new
[1] 0.04447873 -0.12871182 -0.23723027 -0.02300019 -0.03851816 -0.44884138
[7] -0.80601068 -0.71891720 0.03472918 0.05418729 -0.06476798 -0.17292143
[13] -0.61339668 -0.26723470 -0.28321384 -0.67176706 -0.51268009 -0.61530304
[19] -0.51708935 -0.81701762
$y.se
[1] 0.266523429 0.169630890 0.055204841 0.308323909 0.044965845 0.313516781
[7] 0.343833731 0.129416122 0.036851445 0.153187885 0.219717119 0.004629453
[13] 0.015876657 0.015483059 0.138680946 0.079378230 0.062891881 0.051941293
[19] 0.325012999 0.293268186
$predicted.matrix
[,1] [,2]
[1,] 0.311002159 -0.222044699
[2,] 0.040919067 -0.298342714
[3,] -0.292435110 -0.182025429
[4,] 0.285323717 -0.331324101
[5,] -0.083484002 0.006447688
[6,] -0.762358161 -0.135324600
[7,] -1.149844416 -0.462176953
[8,] -0.848333322 -0.589501078
[9,] -0.002122263 0.071580627
[10,] -0.099000596 0.207375173
[11,] -0.284485102 0.154949136
[12,] -0.168291977 -0.177550882
[13,] -0.597520023 -0.629273337
[14,] -0.282717763 -0.251751645
[15,] -0.421894781 -0.144532890
[16,] -0.592388826 -0.751145286
[17,] -0.575571974 -0.449788213
[18,] -0.667244329 -0.563361743
[19,] -0.842102349 -0.192076351
[20,] -1.110285808 -0.523749436
attr(,"class")
[1] "BaggingPrediction"