Control function for minQuad
rank 
a nonnegative integer indicating the 'rank' of the design matrix, only used by method 'exhaustive' or 'x'. If zero, it is estimated by the singular value decomposition of each 'submatrix' associated with the suboptimization problem in the decomposition method. 
method 
a character string (the first letter is sufficient) indicating which quadratic optimizer to use; defaults to 'default'. See details. 
optim.control 
a list of control parameters passed to methods 'tron' or 'loqo'. 
q 
size of the working set; set to 2 for all methods except method = 'tron', for which it defaults to NULL. In that case the working set size is automatically chosen to be sqrt(#violators) at each iteration. 
ws 
a character string indicating the strategy of how to select the working set, defaults to "rv2wg", see details. 
maxit 
maximum number of iterations. 
tol 
tolerance for the termination criterion. 
DUP 
should arguments be passed by reference? Defaults to FALSE. 
NAOK 
should NAs and NaNs be allowed to be passed to C code (no checking)? Defaults to FALSE. 
verbose 
some output at each iteration; possible values are FALSE/TRUE, or an integer if more details are wanted. Defaults to FALSE. 
ret.ws 
defaults to FALSE, indicates whether to return the working set selected at each iteration. 
ret.data 
defaults to FALSE, indicates whether to return the data passed to minQuad. 
Four quadratic optimizers are available within minQuad: "default", "tron", "loqo" and "exhaustive" (optimizer 'x' is a slightly faster implementation of the exhaustive method). For working set size q = 2, the 'default' option is a fast implementation that loosely minimizes the quadratic objective function by "solving" an associated equation at each data point; this is often sufficient to achieve convergence in the DCA loop in rauc.
The "exhaustive" method is a brute-force method that gives an exact solution to each quadratic subproblem in minQuad
and should probably not be used beyond working set size q = 8-10 on most computers. Method 'tron' is a positive semidefinite
quadratic optimizer and thus well suited for low-rank problems; for this method 'q' can be larger, ~100 or perhaps even ~1000.
Method 'loqo' is a positive definite quadratic optimizer that accepts 'm' constraints specified by an (m x n) matrix A in the form v <= A*x <= v+r, with both
v and r finite.
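As an illustration of this constraint form (a minimal Python sketch, not the package's R/C interface), the box constraint 0 <= x <= C used by the dual problem can be written as v <= A*x <= v+r by taking A to be the identity, v = 0 and r = C:

```python
import numpy as np

# Hypothetical example values; A, v, r mirror the loqo constraint form
# v <= A @ x <= v + r described above.
n, C = 5, 1.0
A = np.eye(n)          # one identity row per coordinate, so m = n
v = np.zeros(n)        # lower bounds
r = np.full(n, C)      # ranges; upper bounds are v + r = C

x = np.array([0.0, 0.2, 0.5, 0.9, 1.0])   # a feasible point
Ax = A @ x
feasible = bool(np.all(v <= Ax) and np.all(Ax <= v + r))
print(feasible)
```

Both v and r must be finite here, matching the requirement stated above.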
The default value of the working set size 'q' is 0. This means that if 'method' is 'tron', then
'q' is automatically set to sqrt(no. of violators) at the current iteration in minQuad
(rounded up). Otherwise 'q'
defaults to 2 but may be set to any integer greater than 1.
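The rule above can be sketched in a few lines of Python (an illustration of the documented behavior, not the package's C code; "rounded up" is read as the ceiling):

```python
import math

def working_set_size(q, method, n_violators):
    """Working set size per the rule described in the documentation:
    for method 'tron' with q = 0, use ceil(sqrt(#violators));
    otherwise fall back to the default of 2 unless q > 1 was given."""
    if method == "tron" and q == 0:
        return math.ceil(math.sqrt(n_violators))
    return q if q > 1 else 2

print(working_set_size(0, "tron", 50))      # ceil(sqrt(50)) = 8
print(working_set_size(0, "default", 50))   # falls back to 2
```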
The "ws" argument sets the strategy for selecting the working set.
Denote the two sets of violators as V0 = {p : a[p] > 0 and df[p] > 0} and VC = {p : a[p] < C and df[p] < 0},
where "df" stands for the gradient of the objective function at 'a'.
"greedy" selects extreme pairs (max, min) from the two sets M = {df[i] : a[i] < C} and m = {df[j] : a[j] > 0}.
"v" selects from violators V = V0 U VC ranked by df.
"v2" selects separately from V0 and VC, ranked by df.
"rv" selects without replacement (WOR) from all violators.
"rvwg" selects WOR from all violators V with probability ~ df[V].
"rv2wg" selects WOR from the two sets of violators V0 and VC with probability ~ df[V].
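The violator sets and the "rv2wg" strategy can be sketched as follows (a Python illustration of our reading of the description above; the function name, the even split between V0 and VC, and the |df| weighting are assumptions, not the package's C implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def rv2wg(a, df, C, q, rng):
    """Draw a working set of size q without replacement, half from each
    violator set, with selection probability proportional to |df|."""
    V0 = np.where((a > 0.0) & (df > 0.0))[0]   # violators with a[p] > 0, df[p] > 0
    VC = np.where((a < C) & (df < 0.0))[0]     # violators with a[p] < C, df[p] < 0

    def draw(idx, k):
        if len(idx) == 0 or k == 0:
            return np.array([], dtype=int)
        w = np.abs(df[idx])
        w = w / w.sum()
        k = min(k, len(idx))
        return rng.choice(idx, size=k, replace=False, p=w)

    half = q // 2
    return np.concatenate([draw(V0, half), draw(VC, q - half)])

a  = np.array([0.0, 0.5, 1.0, 0.3, 0.0])
df = np.array([0.2, 0.7, -0.4, -0.1, -0.3])
ws = rv2wg(a, df, C=1.0, q=2, rng=rng)
print(sorted(ws.tolist()))
```

Here V0 = {1} and VC = {3, 4}, so with q = 2 the working set pairs index 1 with one index drawn from {3, 4}.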
A list with the following elements:
convergence 
0 if converged, 1 if the maximum number of iterations was reached. 
alpha 
estimated vector of coefficients. 
value 
value of the objective function. 
iterations 
number of iterations until convergence or "maxit" reached. 
epsilon 
stopping rule value to be compared to "tol". 
n 
the number of coefficients, i.e. the length of 'alpha'. 
nSV 
#{0 < alpha}, no. of support vectors. 
nBSV 
#{alpha == C}, no. of bounded support vectors. 
nFSV 
#{0 < alpha < C}, no. of unbounded support vectors. 
control 
the control argument. 
ws 
if requested, the working set selected at each iteration. 
Krisztian Sebestyen ksebestyen@gmail.com
Youyi Fong youyifong@gmail.com
Shuxin Yin
Fong, Y., Yin, S., and Huang, Y. (2012). Combining Biomarkers Nonlinearly for Classification Using the Area Under the ROC Curve. Biometrika.
Lin, C.-J. and Moré, J.J. (1999). Newton's Method for Large Bound-Constrained Optimization Problems. SIAM Journal on Optimization, 9(4), pp. 1100-1127.
Karatzoglou, A., Smola, A., Hornik, K., and Zeileis, A. (2004). kernlab - An S4 Package for Kernel Methods in R. Journal of Statistical Software, 11(9), pp. 1-20. URL http://www.jstatsoft.org/v11/i09/