qrnn.cost: Smooth approximation to the tilted absolute value cost function

View source: R/qrnn.R

qrnn.cost {qrnn}    R Documentation

Smooth approximation to the tilted absolute value cost function

Description

Smooth approximation to the tilted absolute value cost function used to fit a QRNN model. Optional left censoring, monotone constraints, and additive constraints are supported.
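
The cost being approximated is the tilted absolute value (check) function, rho_tau(u) = tau*u for u >= 0 and (tau - 1)*u for u < 0, which is not differentiable at zero. Below is a minimal base-R sketch of one standard Huber-type smoothing controlled by eps; it is illustrative only, the exact form used internally by qrnn.cost may differ, and tilted.abs, huber, and tilted.approx are hypothetical names.

    ## Exact tilted absolute value (check) function for quantile tau
    tilted.abs <- function(u, tau) ifelse(u > 0, tau * u, (tau - 1) * u)

    ## Huber-type smoothing: quadratic for |u| <= eps, linear beyond,
    ## so the resulting cost is differentiable everywhere
    huber <- function(u, eps) ifelse(abs(u) > eps, abs(u) - eps/2, u^2/(2*eps))
    tilted.approx <- function(u, tau, eps) ifelse(u > 0, tau, 1 - tau) * huber(u, eps)

    ## As eps -> 0 the smooth approximation converges to the exact check function
    u <- seq(-1, 1, length.out = 5)
    cbind(exact = tilted.abs(u, tau = 0.9),
          approx = tilted.approx(u, tau = 0.9, eps = 1e-4))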

Usage

qrnn.cost(weights, x, y, n.hidden, w, tau, lower, monotone,
          additive, eps, Th, Th.prime, penalty, unpenalized)

Arguments

weights

weight vector with the same length as the vector returned by qrnn.initialize.

x

covariate matrix with number of rows equal to the number of samples and number of columns equal to the number of variables.

y

response column matrix with number of rows equal to the number of samples.

n.hidden

number of hidden nodes in the QRNN model.

w

vector of weights with length equal to the number of samples; NULL gives equal weight to each sample.

tau

desired tau-quantile.

lower

left censoring point.

monotone

column indices of covariates for which the monotonicity constraint should hold.

additive

logical; force additive relationships between the covariates and the response.

eps

epsilon value used in the approximation functions.

Th

hidden layer transfer function; use sigmoid, elu, relu, lrelu, softplus, or another non-decreasing function for a nonlinear model and linear for a linear model (see the sketch following this argument list).

Th.prime

derivative of the hidden layer transfer function Th.

penalty

weight penalty for weight decay regularization.

unpenalized

column indices of covariates for which the weight penalty should not be applied to input-hidden layer weights.
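
As noted for the Th argument above, qrnn.cost expects the transfer function and its derivative as a matched pair. The sketch below shows such a pair based on the logistic sigmoid; it is illustrative only, the qrnn package already supplies transfer functions for this purpose, and my.sigmoid / my.sigmoid.prime are hypothetical names.

    ## Logistic sigmoid and its derivative, in the Th / Th.prime
    ## pairing expected by qrnn.cost (illustrative only)
    my.sigmoid <- function(x) 1 / (1 + exp(-x))
    my.sigmoid.prime <- function(x) {
      s <- 1 / (1 + exp(-x))
      s * (1 - s)
    }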

Value

numeric value of the smooth approximation to the tilted absolute value cost function, with an attribute containing a vector of gradient information.
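
A minimal usage sketch on synthetic data, evaluating the cost at an initial weight vector and inspecting the gradient attribute. It assumes the qrnn package's qrnn.initialize, sigmoid, and sigmoid.prime, and uses placeholder settings (w = NULL, lower = -Inf, monotone = NULL, additive = FALSE, eps = 2^-8, penalty = 0, unpenalized = NULL) chosen to mirror common qrnn.fit defaults; adjust for your own data and model.

    library(qrnn)

    set.seed(1)
    x <- matrix(runif(50), ncol = 1)
    y <- as.matrix(2 * x[, 1] + rnorm(50, sd = 0.2))

    n.hidden <- 2
    w.init <- qrnn.initialize(x, y, n.hidden)

    cost <- qrnn.cost(weights = w.init, x = x, y = y, n.hidden = n.hidden,
                      w = NULL, tau = 0.5, lower = -Inf, monotone = NULL,
                      additive = FALSE, eps = 2^-8, Th = sigmoid,
                      Th.prime = sigmoid.prime, penalty = 0, unpenalized = NULL)
    cost                # cost value
    attributes(cost)    # gradient information stored as an attribute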

See Also

qrnn.fit

