penalty_control: Options for penalty setup in pre-processing

View source: R/controls.R

penalty_control    R Documentation

Options for penalty setup in pre-processing

Description

Options for penalty setup in pre-processing

Usage

penalty_control(
  defaultSmoothing = NULL,
  df = 10,
  null_space_penalty = FALSE,
  absorb_cons = FALSE,
  anisotropic = TRUE,
  zero_constraint_for_smooths = TRUE,
  no_linear_trend_for_smooths = FALSE,
  hat1 = FALSE,
  sp_scale = function(x) ifelse(is.list(x) | is.data.frame(x), 1/NROW(x[[1]]),
    1/NROW(x))
)
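
A minimal sketch of the defaults; penalty_control() assembles its arguments into a list, so the result can be inspected directly (entry names are assumed here to mirror the argument names):

library(deepregression)

pen <- penalty_control()   # all defaults
str(pen)                   # the full list of options
pen$df                     # 10
pen$null_space_penalty     # FALSE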

Arguments

defaultSmoothing

function applied to all s-terms; by default (NULL), the minimum df across all possible terms is used. Must be a function taking the smooth term (as returned by mgcv's smoothCon) and an argument df.

df

degrees of freedom for all non-linear structural terms (default = 10); either one common value or a list of the same length as the number of parameters. If different df values need to be assigned to different smooth terms, pass df as an argument to s(), te() or ti() instead (see the sketch after this list).

null_space_penalty

logical value; if TRUE, the null space will also be penalized for smooth effects. By default, this is equal to the value given for variational.

absorb_cons

logical; adds an identifiability constraint to the basis. See ?mgcv::smoothCon for more details.

anisotropic

logical; whether or not to use anisotropic smoothing (default is TRUE)

zero_constraint_for_smooths

logical; the same as absorb_cons, but enforced explicitly. If TRUE, a constraint is put on each smooth to have zero mean. Can be a vector of length length(list_of_formulas), one value per distribution parameter.

no_linear_trend_for_smooths

logical; as zero_constraint_for_smooths, but removes the linear trend from splines instead

hat1

logical; if TRUE, the smoothing parameter is defined via the trace of the hat matrix, sum(diag(H)); otherwise via sum(diag(2*H - HH))

sp_scale

function of the response, used to scale the penalty (1/n by default); illustrated below
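
A hedged sketch of the two ways to set smoothness, globally via df versus per term inside the smooth constructor, together with a custom sp_scale; the variable names (x1, x2) are purely illustrative:

# global setting: df = 8 applies to every smooth without its own df;
# sp_scale is swapped for a constant so the penalty is not rescaled by 1/n
pen <- penalty_control(
  df = 8,
  sp_scale = function(x) 1
)

# per-term setting: df is given inside s(), te() or ti() in the formula,
# e.g. ~ 1 + s(x1, df = 5) + te(x1, x2, df = 12)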

Value

Returns a list of options
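
Examples

A usage sketch, assuming the returned options are passed to deepregression() via its penalty_options argument (check args(deepregression) in the installed version); the data and formulas here are purely illustrative:

library(deepregression)

set.seed(1)
n <- 200
data <- data.frame(x = rnorm(n))
y <- sin(data$x) + rnorm(n, sd = 0.3)

# a normal model with one penalized smooth for the mean;
# df = 6 limits the smooth's flexibility, and the null space
# of s(x) is penalized as well
mod <- deepregression(
  y = y,
  list_of_formulas = list(loc = ~ 1 + s(x), scale = ~ 1),
  data = data,
  penalty_options = penalty_control(df = 6, null_space_penalty = TRUE)
)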

