Bundle methods for minimization of convex and non-convex risk under L1 or L2 regularization. Implements the algorithm proposed by Teo et al. (JMLR 2010) as well as the extension proposed by Do and Artieres (JMLR 2012). The package ships with a large set of machine-learning loss functions, making it well suited to large-scale data analysis. Applications include structured prediction, linear SVM, multi-class SVM, F-beta optimization, ROC optimization, ordinal regression, quantile regression, epsilon-insensitive regression, least mean squares, logistic regression, and least absolute deviation regression (see the package examples), all with L1 or L2 regularization.
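The core idea of such bundle methods — iteratively building a piecewise-linear lower bound (a "bundle" of cutting planes) on the empirical risk and minimizing it together with the regularizer — can be sketched independently of the package. The Python sketch below is illustrative only: `bundle_l1`, `hinge_risk`, and all parameter names are assumptions, not the package's API. It uses hinge loss with L1 regularization, a case where the master problem is a linear program:

```python
import numpy as np
from scipy.optimize import linprog

def hinge_risk(w, X, y):
    """Mean hinge risk and one subgradient at w."""
    margins = 1 - y * (X @ w)
    active = margins > 0
    risk = np.maximum(margins, 0.0).mean()
    grad = -(y[active, None] * X[active]).sum(axis=0) / len(y)
    return risk, grad

def bundle_l1(X, y, lam=0.1, max_iter=30, tol=1e-4):
    """Cutting-plane (bundle) minimization of lam*||w||_1 + hinge risk.

    Each iteration adds the plane risk(w') >= a.w' + b tangent at the
    current w, then solves the master LP over all planes collected so far.
    Illustrative sketch only -- not the bmrm implementation.
    """
    d = X.shape[1]
    w = np.zeros(d)
    A, B = [], []                      # cutting planes: slopes and offsets
    for _ in range(max_iter):
        risk, a = hinge_risk(w, X, y)
        A.append(a)
        B.append(risk - a @ w)
        # Master LP: min lam*||w||_1 + xi  s.t.  xi >= a_i.w + b_i, xi >= 0
        # variables: [u (d), v (d), xi] with w = u - v and u, v >= 0
        c = np.concatenate([lam * np.ones(2 * d), [1.0]])
        A_ub = np.hstack([np.array(A), -np.array(A), -np.ones((len(A), 1))])
        b_ub = -np.array(B)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (2 * d + 1))
        u, v, xi = res.x[:d], res.x[d:2 * d], res.x[-1]
        w = u - v
        # stop when the true objective meets the model's lower bound
        obj = lam * np.abs(w).sum() + hinge_risk(w, X, y)[0]
        lower = lam * np.abs(w).sum() + xi
        if obj - lower < tol:
            break
    return w
```

The gap between the true regularized risk and the piecewise-linear model gives a natural stopping criterion, which is the same monitoring principle the bundle literature uses; with an L2 regularizer the master problem becomes a small quadratic program instead of an LP.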
Date of publication: 2015-01-15 16:59:17
Maintainer: Julien Prados <firstname.lastname@example.org>
bmrm: Bundle Methods for Regularized Risk Minimization
costMatrix: Compute or check the structure of a cost matrix
epsilonInsensitiveRegressionLoss: The loss function to perform an epsilon-insensitive regression...
fbetaLoss: F-beta score loss function
gradient: Return or set gradient attribute
hingeLoss: Hinge Loss function for SVM
ladRegressionLoss: The loss function to perform a least absolute deviation...
lmsRegressionLoss: The loss function to perform a least mean square regression
logisticRegressionLoss: The loss function to perform a logistic regression
nrbm: Convex and non-convex risk minimization with L2...
ordinalRegressionLoss: The loss function for ordinal regression
quantileRegressionLoss: The loss function to perform a quantile regression
rocLoss: The loss function to maximize area under the ROC curve
softMarginVectorLoss: Soft Margin Vector Loss function for multiclass SVM
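The losses listed above all share the same contract the optimizer needs: evaluated at a parameter vector, each yields the risk value together with a (sub)gradient (cf. the `gradient` entry). A minimal Python analogue of that contract for the epsilon-insensitive regression loss — illustrative only; `epsilon_insensitive_loss` and its parameters are assumptions, not the package's API:

```python
import numpy as np

def epsilon_insensitive_loss(w, X, y, eps=0.1):
    """Value and subgradient of sum_i max(|x_i.w - y_i| - eps, 0)."""
    r = X @ w - y                          # residuals of the linear model
    active = np.abs(r) > eps               # points outside the eps-tube
    value = np.maximum(np.abs(r) - eps, 0.0).sum()
    grad = (np.sign(r[active])[:, None] * X[active]).sum(axis=0)
    return value, grad
```

Because the bundle algorithm only ever queries this value/gradient pair, swapping one loss for another (hinge, logistic, quantile, ...) leaves the optimizer untouched.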