sgl_fit: Fit a sparse group lasso regularization path.


Description

A sequence of minimizers (one for each lambda given in the lambda argument) of

\mathrm{loss}(β) + λ \left( (1-α) ∑_{J=1}^m γ_J \|β^{(J)}\|_2 + α ∑_{i=1}^{n} ξ_i |β_i| \right)

where \mathrm{loss} is the loss/objective function specified by module_name. The parameters are organized in the parameter matrix β of dimension q \times p, and β^{(J)} denotes the parameters of the J'th group (the columns of β belonging to group J). The group weights are γ \in [0,∞)^m, and the parameter weights are ξ = (ξ^{(1)},…, ξ^{(m)}) \in [0,∞)^n with ξ^{(1)} \in [0,∞)^{n_1},…, ξ^{(m)} \in [0,∞)^{n_m}.
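
To make the penalty concrete, the following plain R sketch evaluates it for a small example; beta, grouping and the weight objects here are illustrative placeholders and not part of the sglOptim API.

# Illustrative only: evaluate the sparse group lasso penalty for a small example.
# beta is a q x p parameter matrix; 'grouping' assigns each column of beta to a group.
set.seed(1)
q <- 2; p <- 4
beta <- matrix(rnorm(q * p), nrow = q, ncol = p)
grouping <- factor(c(1, 1, 2, 2))                  # two groups of two columns each
groupWeights <- rep(1, nlevels(grouping))          # gamma_J, one weight per group
parameterWeights <- matrix(1, nrow = q, ncol = p)  # xi_i, one weight per parameter
alpha <- 0.5
lambda <- 0.1

# ||beta^(J)||_2 for each group J (2-norm over all parameters in the group's columns)
group_norms <- sapply(levels(grouping), function(g) {
  sqrt(sum(beta[, grouping == g]^2))
})

penalty <- lambda * ((1 - alpha) * sum(groupWeights * group_norms) +
                     alpha * sum(parameterWeights * abs(beta)))
penalty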

Usage

sgl_fit(module_name, PACKAGE, data, parameterGrouping, groupWeights,
  parameterWeights, alpha, lambda, return = 1:length(lambda),
  algorithm.config = sgl.standard.config)

Arguments

module_name

reference to the objective-specific C++ routines.

PACKAGE

name of the calling package.

data

a list of data objects – will be passed to the specified module.

parameterGrouping

grouping of the parameters, a vector of length p. Each element of the vector specifies the group of the parameters in the corresponding column of β.

groupWeights

the group weights, a vector of length length(unique(parameterGrouping)) (the number of groups).

parameterWeights

a matrix of size q \times p.

alpha

the α value: 0 gives the group lasso penalty, 1 gives the lasso penalty, and values between 0 and 1 give the sparse group lasso penalty.

lambda

the lambda sequence for the regularization path.

return

the indices of lambda values for which to return fitted parameters.

algorithm.config

the algorithm configuration to be used.
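
As a rough illustration of how these arguments fit together, the sketch below builds the grouping, weight and lambda objects for a hypothetical problem. The module name "my_module" and package name "myPackage" are placeholders, since a real call requires a calling package that registers the corresponding C++ objective; the call itself is therefore left commented out.

# Hypothetical sketch: constructing the arguments for sgl_fit.
q <- 2; p <- 10; n <- 50
x <- matrix(rnorm(n * p), n, p)
y <- rnorm(n)

parameterGrouping <- factor(rep(1:5, each = 2))      # 5 groups, 2 columns each
groupWeights <- rep(1, nlevels(parameterGrouping))   # gamma_J, one per group
parameterWeights <- matrix(1, nrow = q, ncol = p)    # xi, one per parameter
lambda <- exp(seq(log(1), log(0.01), length.out = 20))

# fit <- sgl_fit("my_module", "myPackage",
#                data = list(X = x, Y = y),
#                parameterGrouping = parameterGrouping,
#                groupWeights = groupWeights,
#                parameterWeights = parameterWeights,
#                alpha = 0.5, lambda = lambda)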

Value

beta

the fitted parameters – a list of length length(return), with each entry a q \times (p+1) matrix holding the fitted parameters for the corresponding lambda value.

loss

the values of the loss function.

objective

the values of the objective function (i.e. loss + penalty).

lambda

the lambda values used.
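
Assuming fit holds the list returned by sgl_fit, the components above can be inspected as follows (a sketch, not output from an actual run):

length(fit$beta)  # number of returned fits, equal to length(return)
fit$beta[[1]]     # q x (p+1) matrix of fitted parameters at the first returned lambda
fit$loss          # loss values along the path
fit$objective     # loss + penalty values along the path
fit$lambda        # the lambda sequence used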

Author(s)

Martin Vincent

