ForwardBoost: Computation of the ForwardBoost Algorithm


View source: R/ForwardBoost.R

Description

This function fits a GLM based on penalized likelihood inference, using the ForwardBoost algorithm. It is primarily intended for internal use; you can access it via the argument setting method = "ForwardBoost" in lqa, cv.lqa or plot.lqa.
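
A minimal sketch of this route on simulated data is given below. The simulated data and the (x, y) calling form of lqa() are illustrative assumptions; the penalty constructor lasso() and the method setting follow this help page.

    ## simulated standardized regressors and a binary response (illustrative only)
    set.seed(1)
    x <- scale(matrix(rnorm(100 * 5), ncol = 5))
    y <- rbinom(100, 1, plogis(drop(x %*% c(1, -1, 0.5, 0, 0))))

    ## penalized logistic regression fitted by the ForwardBoost algorithm
    fit <- lqa(x = x, y = y, family = binomial(),
               penalty = lasso(lambda = 1.7), method = "ForwardBoost")
    fit$coefficients    # estimated coefficients of the fitted model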

Usage

   ForwardBoost (x, y, family = NULL, penalty = NULL,
        intercept = TRUE, weights = rep (1, nobs),
        control = lqa.control (), nu = 1, monotonic = TRUE, ...)
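
For completeness, the sketch below shows a direct call following the signature above, using simulated Gaussian data (the data and the lambda value are illustrative assumptions). Since ForwardBoost is primarily an internal routine, calling it through lqa() as described above is the recommended route.

    ## simulated standardized regressors and a Gaussian response (illustrative only)
    set.seed(42)
    x <- scale(matrix(rnorm(60 * 3), ncol = 3))
    y <- drop(x %*% c(2, 0, -1)) + rnorm(60)

    ## direct call with the arguments documented below
    res <- ForwardBoost(x = x, y = y, family = gaussian(),
                        penalty = lasso(lambda = 0.5), intercept = TRUE,
                        nu = 1, monotonic = TRUE)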

Arguments

x

matrix of standardized regressors. This matrix does not need to include a first column of ones when a GLM with intercept is to be fitted.

y

vector of observed response values.

family

a description of the error distribution and link function to be used in the model. This can be a character string naming a family function, a family function or the result of a call to a family function. See family() for further details.

penalty

a description of the penalty to be used in the fitting procedure, e.g. penalty = lasso (lambda = 1.7).

intercept

a logical object indicating whether the model should include an intercept (this is recommended) or not. The default value is intercept = TRUE.

weights

an optional vector of weights for the observations; the default rep (1, nobs) assigns equal weight to each observation.

control

a list of parameters for controlling the fitting process. See lqa.control.

nu

parameter ν of the ForwardBoost algorithm. It controls the step length of the update. See Ulbricht (2010).

monotonic

a logical variable. If TRUE then the number of active regressors increases monotonically during the iterations. This is in line with the ForwardBoost algorithm and hence recommended.

...

further arguments.

Details

The ForwardBoost algorithm is described in Ulbricht (2010); see there for a more detailed technical description.

Value

ForwardBoost returns a list containing the following elements (a short inspection sketch follows this list):

coefficients

the vector of standardized estimated coefficients.

beta.mat

matrix containing the estimated coefficients from all iterations (rowwise).

m.stop

the number of iterations until AIC reaches its minimum.

stop.at

the number of iterations until convergence.

aic.vec

vector of AIC values across all iterations.

bic.vec

vector of BIC values across all iterations.

converged

a logical variable. This will be TRUE if the algorithm has indeed converged.

min.aic

minimum value of AIC criterion.

min.bic

minimum value of BIC criterion.

tr.H

the trace of the hat matrix.

tr.Hatmat

vector of hat matrix traces through all iterations.

dev.m

vector of deviances through all iterations.
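
Continuing the hypothetical res object from the sketch after the Usage section, the elements listed above can be inspected as follows:

    res$coefficients    # standardized coefficient estimates
    res$m.stop          # iteration at which the AIC is minimal
    res$converged       # TRUE if the algorithm converged
    plot(res$aic.vec, type = "l",
         xlab = "iteration", ylab = "AIC")    # AIC path over the iterations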

Author(s)

Jan Ulbricht

References

Ulbricht, Jan (2010) Variable Selection in Generalized Linear Models. Ph.D. Thesis. LMU Munich.

See Also

lqa, GBlockBoost

