Description

Standard methods for computing on fvcm objects.
Usage

## S3 method for class 'fvcm'
oobloss(object, fun = NULL, ranef = FALSE, ...)

## S3 method for class 'fvcm'
plot(x, type = c("default", "coef", "simple", "partdep"),
     tree = NULL, ask = NULL, ...)

## S3 method for class 'fvcm'
predict(object, newdata = NULL,
        type = c("link", "response", "prob", "class", "coef", "ranef"),
        ranef = FALSE, na.action = na.pass, verbose = FALSE, ...)

Arguments

object, x   an object of class fvcm.

fun         the loss function. The default loss function is defined as
            the sum of the deviance residuals.

newdata     an optional data frame in which to look for variables with
            which to predict. If omitted, the training data are used.

type        character string indicating the type of plot or prediction.

tree        integer vector. Which trees should be plotted.

ask         logical. Whether input should be asked for before printing
            the next panel.

ranef       logical scalar or matrix indicating whether predictions
            should be based on random effects.

na.action   function determining what should be done with missing values
            of fixed-effect variables in newdata.

verbose     logical scalar. If TRUE, verbose output is printed.

...         further arguments passed to other methods.
Details

oobloss.fvcm estimates the out-of-bag loss based on predictions of the
model that aggregates only those trees in which the observation did not
appear (cf. Hastie et al., 2001, sec. 15). The prediction error is
computed as the sum of prediction errors obtained with fun, which are
the deviance residuals by default.
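The fun argument allows substituting a user-defined loss. As a hedged
sketch only (the exact interface expected for fun, such as its argument
names and whether it receives case weights, should be verified against
the vcrpart documentation), a squared-error loss might look like this:

```r
## Hypothetical user-defined loss for oobloss(). We assume here that
## 'fun' receives the observed responses and the out-of-bag predictions
## and returns the total loss; check ?oobloss for the exact interface.
sqErrLoss <- function(y, mu, ...) sum((y - mu)^2)

## the function simply sums the squared residuals
sqErrLoss(c(1, 2, 3), c(1, 2, 2))

## usage sketch (not run): oobloss(model.2, fun = sqErrLoss)
```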
The plot and predict methods are analogous to plot.tvcm and
predict.tvcm respectively. Note that the plot options mean and conf.int
for type = "coef" are not available (they are internally set to FALSE).
Further undocumented, available methods are fitted, print and ranef.
All of these methods have the same arguments as the corresponding
default methods.
Author(s)

Reto Buergin
References

Breiman, L. (1996). Bagging Predictors. Machine Learning, 24(2), 123–140.

Breiman, L. (2001). Random Forests. Machine Learning, 45(1), 5–32.

Hastie, T., R. Tibshirani and J. Friedman (2001). The Elements of
Statistical Learning (2 ed.). New York, USA: Springer-Verlag.
Examples

## ------------------------------------------------------------------- #
## Dummy example 1:
##
## Fitting a random forest tvcm on artificially generated ordinal
## longitudinal data. The parameters 'maxstep = 1' and 'K = 2' are
## chosen to restrict the computations.
## ------------------------------------------------------------------- #
## load the data
data(vcrpart_1)
## fit and analyse the model
control <-
  fvcolmm_control(mtry = 2, maxstep = 1,
                  folds = folds_control(type = "subsampling", K = 2, prob = 0.75))
model.1 <-
  fvcolmm(y ~ 1 + wave + vc(z3, z4, by = treat, intercept = TRUE) + re(1|id),
          family = cumulative(), subset = 1:100,
          data = vcrpart_1, control = control)
## estimating the out-of-bag loss
suppressWarnings(oobloss(model.1))
## predicting responses and varying coefficients for subject '27'
subs <- vcrpart_1$id == "27"
## predict coefficients
predict(model.1, newdata = vcrpart_1[subs,], type = "coef")
## marginal response prediction
predict(model.1, vcrpart_1[subs,], "response", ranef = FALSE)
## conditional response prediction
re <- matrix(5, 1, 1, dimnames = list("27", "(Intercept)"))
predict(model.1, vcrpart_1[subs,], "response", ranef = re)
predict(model.1, vcrpart_1[subs,], "response", ranef = 0 * re)
## predicting insample random effects
head(predict(model.1, type = "ranef"))
## fitted responses (marginal and conditional prediction)
head(predict(model.1, type = "response", ranef = FALSE))
head(predict(model.1, type = "response", ranef = TRUE))
## ------------------------------------------------------------------- #
## Dummy example 2:
##
## Fitting a random forest tvcm on artificially generated normally
## distributed data. The parameters 'maxstep = 3' and 'K = 3' are
## chosen to restrict the computations and 'minsize = 5' to obtain at
## least a few splits given the small sample size.
## ------------------------------------------------------------------- #
data(vcrpart_2)
## fit and analyse the model
control <- fvcm_control(mtry = 1L, minsize = 5, maxstep = 3,
                        folds = folds_control("subsampling", K = 3, 0.75))
model.2 <- fvcglm(y ~ 1 + vc(z1, z2, by = x1, intercept = TRUE) + x2,
                  data = vcrpart_2,
                  family = gaussian(), subset = 1:50, control = control)
## estimating the out-of-bag loss
suppressWarnings(oobloss(model.2))
## predict the coefficient for individual cases
predict(model.2, vcrpart_2[91:100, ], "coef")
