optimize_hinge_general: Optimize a linear classifier with possibly negative...


View source: R/hinge_mcr.r

Description

The optimization uses simulated annealing (SA), with a gradient-based search at each step. This function is used in our paper but is not the primary focus of this package; note that the default option ignore_below_zero = TRUE may cause problems if the function is used outside the scope of MCR computations.

Usage

optimize_hinge_general(y, X, reg_matrix = diag(p), n = length(y),
  case.weights = rep(1/n, n), reg_threshold, K = 1000, p = dim(X)[2],
  start = NULL, constr_tol = 10^-4, sparse_warning = TRUE,
  maxit_SA = 1000, extra_step_NM = TRUE, short_cg_lim = 15,
  long_cg_lim = 500, ignore_below_zero = TRUE)

Arguments

y

outcome vector with elements -1 or 1.

X

covariate matrix (n x p), which should not contain an intercept or constant column.

reg_matrix

Matrix R, where w'Rw is the penalty of a coefficient vector w.

n

sample size

case.weights

vector of numeric multipliers (possibly negative) applied to each observation when computing the loss sum.

reg_threshold

Value for w'Rw above which the penalty is applied

K

penalty multiplier for w'Rw

p

covariate dimension

start

starting value for coefficient vector

constr_tol

tolerance buffer on the regularization constraint, beyond which the penalty is applied

sparse_warning

warn the user if a constant (intercept) column is detected in X

maxit_SA

number of iterations for SA steps

extra_step_NM

whether to follow the SA search with an additional Nelder-Mead search

short_cg_lim

tuning parameter used to set the initial value of the search

long_cg_lim

number of gradient-based steps to take at each SA iteration

ignore_below_zero

stop the SA search if an objective value below zero is discovered. This is irrelevant if all weights are positive, and it is useful within the MCR binary search, but it may cause problems for applications of this function beyond the computation of MCR.

Value

a linear coefficient vector
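
For illustration, a hypothetical call on simulated data might look like the sketch below. The data, the seed, and the threshold value of 2 are invented for this example; only the function and its arguments come from this documentation, and the aaronjfisher/mcr package must be installed (e.g. via remotes::install_github("aaronjfisher/mcr")) for it to run.

```r
# Illustrative sketch only: assumes the mcr package is installed from GitHub.
library(mcr)

set.seed(1)
n_obs <- 100
p_dim <- 3

# Simulated covariates (no intercept column) and -1/1 outcomes
X <- matrix(rnorm(n_obs * p_dim), nrow = n_obs, ncol = p_dim)
y <- sign(X %*% c(1, -1, 0.5) + rnorm(n_obs))

# Hinge-loss linear classifier with a quadratic constraint:
# the penalty K * w'Rw kicks in once w'Rw (here R = identity, the
# default reg_matrix) exceeds reg_threshold.
w_hat <- optimize_hinge_general(
  y = y, X = X,
  reg_matrix = diag(p_dim),
  reg_threshold = 2,
  sparse_warning = FALSE
)

length(w_hat)  # one coefficient per covariate column
```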


aaronjfisher/mcr documentation built on Jan. 2, 2020, 4:38 p.m.