Description

Perform prediction using a sequential bagging method with a tree-based learning algorithm.
Usage

SQBreg(data.train, data.test, y, res, reps, cores, FunKDE, control,
       SQBalgorithm.1, SQBalgorithm.2, k, ncomp, nnet.size)
Arguments

data.train      Training dataset.

data.test       Testing dataset.

y               Numeric response variable.

res             Resampling size; it cannot be greater than the input data size.

reps            Number of replicates for the first bagging; default 100.

cores           Number of cores to use; default is one core. Use cores = 'maxcores' to use all available cores.

FunKDE          Kernel density estimate function; a different kernel can be supplied for the fit. Default is the logistic kernel.

control         Control parameters for the tree learner, as produced by rpart.control in the rpart package (see the sketch after this argument list).

SQBalgorithm.1  Learner used for the initial training. Options: CART, lm (default), knnreg, nnet, PCR.

SQBalgorithm.2  Learner used for the final training. Options: CART, lm (default), knnreg, nnet, PCR.

k               Number of nearest neighbours used by knnreg.

ncomp           Number of components used by PCR.

nnet.size       Number of units in the hidden layer for nnet.
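The control argument is the usual way to tune the CART learner. The sketch below is illustrative only: the object name ctrl, the specific tuning values, and the assumption that SQBreg accepts the list returned by rpart.control() are not taken from the original page.

library(rpart)

## Build a control object for the tree learner; the particular
## settings (smaller minsplit, lower cp) are only examples.
ctrl <- rpart.control(minsplit = 10, cp = 0.005)

## It would then be passed along with the CART learner, e.g.
##   SQBreg(..., control = ctrl, SQBalgorithm.1 = CART)
## assuming the installed version accepts an rpart.control() object here.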
Value

Given the testing set input, returns a regression prediction for each test observation.
References

Breiman, L., Friedman, J. H., Olshen, R. A. and Stone, C. J. (1984). Classification and Regression Trees.

Soleymani, M. and Lee, S. M. S. (2014). Sequential combination of weighted and nonparametric bagging for classification. Biometrika, 101(2), 491–498.

Efron, B. (1979). Bootstrap methods: another look at the jackknife. Annals of Statistics, 7(1), 1–26.
Examples
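The original example code is not reproduced on this page, so the following is a minimal usage sketch. It assumes the function is provided by the SQB package, that the call matches the signature shown under Usage, that y takes the numeric response vector of the training data, and that unspecified arguments fall back to their defaults (lm learners, logistic kernel); learner-specific arguments (k, ncomp, nnet.size) are omitted on the assumption that they are only needed when the corresponding learner is selected.

library(SQB)

## Simulate a small regression problem
set.seed(1)
n  <- 200
x1 <- rnorm(n)
x2 <- runif(n)
y.all <- 1 + 2 * x1 - x2 + rnorm(n, sd = 0.3)
dat   <- data.frame(x1 = x1, x2 = x2)

## Split the predictors into training and testing sets
train.id   <- sample(n, 150)
data.train <- dat[train.id, ]
data.test  <- dat[-train.id, ]

## Sequential bagging prediction with the default learners;
## res is the resampling size, reps the number of first-stage replicates
pred <- SQBreg(data.train, data.test, y = y.all[train.id],
               res = 100, reps = 50)

## Compare predictions with the held-out responses
cor(pred, y.all[-train.id])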