This class sets up the base learning models and their respective parameter settings used to learn the ensemble.
learner
a character vector with the base learners to be trained. The currently available models are:
Gaussian Process models, from the kernlab package. See gausspr for a complete description and possible parametrization. See bm_gaussianprocess for the function implementation.
Projection Pursuit Regression models, from the stats package. See ppr for a complete description and possible parametrization. See bm_ppr for the function implementation.
Generalized Linear Models, from the glmnet package. See glmnet for a complete description and possible parametrization. See bm_glm for the function implementation.
Generalized Boosted Regression models, from the gbm package. See gbm for a complete description and possible parametrization. See bm_gbm for the function implementation.
Random Forest models, from the ranger package. See ranger for a complete description and possible parametrization. See bm_randomforest for the function implementation.
M5 tree models, from the Cubist package. See cubist for a complete description and possible parametrization. See bm_cubist for the function implementation.
Multivariate Adaptive Regression Splines models, from the earth package. See earth for a complete description and possible parametrization. See bm_mars for the function implementation.
Support Vector Regression models, from the kernlab package. See ksvm for a complete description and possible parametrization. See bm_svr for the function implementation.
Feedforward Neural Network models, from the nnet package. See nnet for a complete description and possible parametrization. See bm_ffnn for the function implementation.
Partial Least Squares Regression and Principal Component Regression models, from the pls package. See mvr for a complete description and possible parametrization. See bm_pls_pcr for the function implementation.
learner_pars
a list with parameter settings for the learners. For each model, an inner list should be created with the specified parameters.
Check each implementation to see the possible parameter variations (also exemplified below).
# A PPR model and a GLM model with default parameters
model_specs(learner = c("bm_ppr", "bm_glm"), learner_pars = NULL)
# A PPR model and a SVR model. The listed parameters are combined
# with a cartesian product.
# With these specifications an ensemble with 6 predictive base
# models will be created. Two PPR models, one with 2 nterms
# and another with 4; and 4 SVR models, combining the kernel
# and C parameters.
specs <- model_specs(
  learner = c("bm_ppr", "bm_svr"),
  learner_pars = list(
    bm_ppr = list(nterms = c(2, 4)),
    bm_svr = list(kernel = c("vanilladot", "polydot"), C = c(1, 5))
  )
)
# All parameters currently available (parameter values can differ)
model_specs(
  learner = c("bm_ppr", "bm_svr", "bm_randomforest",
              "bm_gaussianprocess", "bm_cubist", "bm_glm",
              "bm_gbm", "bm_pls_pcr", "bm_ffnn", "bm_mars"),
  learner_pars = list(
    bm_ppr = list(
      nterms = c(2, 4),
      sm.method = "supsmu"
    ),
    bm_svr = list(
      kernel = "rbfdot",
      C = c(1, 5),
      epsilon = .01
    ),
    bm_glm = list(
      alpha = c(1, 0)
    ),
    bm_randomforest = list(
      num.trees = 500
    ),
    bm_gbm = list(
      interaction.depth = 1,
      shrinkage = c(.01, .005),
      n.trees = c(100)
    ),
    bm_mars = list(
      nk = 15,
      degree = 3,
      thresh = .001
    ),
    bm_ffnn = list(
      size = 30,
      decay = .01
    ),
    bm_pls_pcr = list(
      method = c("kernelpls", "simpls", "cppls")
    ),
    bm_gaussianprocess = list(
      kernel = "vanilladot",
      tol = .01
    ),
    bm_cubist = list(
      committees = 50,
      neighbors = 0
    )
  )
)
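The specifications above only declare the base models; they take effect when passed to an ensemble constructor. The sketch below assumes the tsensembler package, where a model_specs object is given to a constructor such as ADE together with a formula and an embedded time series; the embedding dimension and dataset are illustrative choices, not prescribed values.

```r
# Hedged sketch: learning an ensemble from a model_specs object,
# assuming tsensembler's ADE constructor and bundled example data.
library(tsensembler)

# embed the example time series into a tabular format (12 lags is illustrative)
data("water_consumption")
train <- embed_timeseries(water_consumption, embedding.dimension = 12)

# base model specifications: a PPR model and a GLM model with defaults
specs <- model_specs(learner = c("bm_ppr", "bm_glm"), learner_pars = NULL)

# learn the ensemble and predict on (here, the training) observations
model <- ADE(target ~ ., train, specs)
preds <- predict(model, train)
```

Any of the larger specifications shown earlier can be substituted for `specs`; the constructor trains one base model per parameter combination.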