svmlin    R Documentation

Description

R interface to the svmlin code by Vikas Sindhwani and S. Sathiya Keerthi for fast linear transductive SVMs.
Usage

svmlin(X, y, X_u = NULL, algorithm = 1, lambda = 1, lambda_u = 1,
  max_switch = 10000, pos_frac = 0.5, Cp = 1, Cn = 1,
  verbose = FALSE, intercept = TRUE, scale = FALSE, x_center = FALSE)
Arguments

X           Matrix or sparseMatrix containing the labeled feature vectors, without intercept.
y           factor containing the class assignments of the labeled examples.
X_u         Matrix or sparseMatrix containing the unlabeled feature vectors, without intercept.
algorithm   integer; algorithm choice, see Details (default: 1).
lambda      double; regularization parameter lambda (default: 1).
lambda_u    double; regularization parameter lambda_u for the unlabeled objects (default: 1).
max_switch  integer; maximum number of switches in TSVM (default: 10000).
pos_frac    double; positive class fraction of the unlabeled data (default: 0.5; a usage sketch follows this list).
Cp          double; relative cost for positive examples (only available with algorithm 1).
Cn          double; relative cost for negative examples (only available with algorithm 1).
verbose     logical; controls the verbosity of the output.
intercept   logical; whether an intercept should be included (default: TRUE).
scale       logical; should the features be normalized? (default: FALSE)
x_center    logical; should the features be centered? (default: FALSE)
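The following minimal sketch is not part of the original example code; it reuses the svmlin_example data from the Examples section and uses illustrative values for pos_frac, Cp and Cn:

library(RSSL)
data(svmlin_example)

# Transductive fit (algorithm 2) assuming roughly 30% of the unlabeled
# objects belong to the positive class (illustrative value)
fit_frac <- svmlin(svmlin_example$X_train[1:50, ], svmlin_example$y_train,
                   X_u = svmlin_example$X_train[-c(1:50), ],
                   algorithm = 2, pos_frac = 0.3)

# Supervised L2-SVM-MFN (algorithm 1) weighting positive examples twice
# as heavily as negative ones (illustrative values)
fit_cost <- svmlin(svmlin_example$X_train[1:50, ], svmlin_example$y_train,
                   algorithm = 1, Cp = 2, Cn = 1)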
Details

The codes to select the algorithm are the following:

0. Regularized Least Squares Classification
1. SVM (L2-SVM-MFN)
2. Multi-switch Transductive SVM (using L2-SVM-MFN)
3. Deterministic Annealing Semi-supervised SVM (using L2-SVM-MFN)
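As a hedged illustration of these codes (again reusing svmlin_example; not part of the original examples), the four algorithms can be fit and compared on the test set as follows:

library(RSSL)
data(svmlin_example)

X_l <- svmlin_example$X_train[1:50, ]     # labeled features
y_l <- svmlin_example$y_train             # labels
X_u <- svmlin_example$X_train[-c(1:50), ] # unlabeled features

fits <- list(
  rlsc  = svmlin(X_l, y_l, X_u = NULL, algorithm = 0), # Regularized Least Squares
  svm   = svmlin(X_l, y_l, X_u = NULL, algorithm = 1), # L2-SVM-MFN
  tsvm  = svmlin(X_l, y_l, X_u = X_u,  algorithm = 2), # Multi-switch TSVM
  dasvm = svmlin(X_l, y_l, X_u = X_u,  algorithm = 3)  # Deterministic Annealing S3VM
)

# Test-set accuracy for each algorithm
sapply(fits, function(f) mean(predict(f, svmlin_example$X_test) == svmlin_example$y_test))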
References

Vikas Sindhwani and S. Sathiya Keerthi. Large Scale Semi-supervised Linear SVMs. Proceedings of ACM SIGIR, 2006.

Vikas Sindhwani and S. Sathiya Keerthi. Newton Methods for Fast Solution of Semi-supervised Linear SVMs. Book chapter in Large Scale Kernel Machines, MIT Press, 2006.
See Also

Other RSSL classifiers:
EMLeastSquaresClassifier,
EMLinearDiscriminantClassifier,
GRFClassifier,
ICLeastSquaresClassifier,
ICLinearDiscriminantClassifier,
KernelLeastSquaresClassifier,
LaplacianKernelLeastSquaresClassifier,
LaplacianSVM,
LeastSquaresClassifier,
LinearDiscriminantClassifier,
LinearSVM,
LinearTSVM,
LogisticLossClassifier,
LogisticRegression,
MCLinearDiscriminantClassifier,
MCNearestMeanClassifier,
MCPLDA,
MajorityClassClassifier,
NearestMeanClassifier,
QuadraticDiscriminantClassifier,
S4VM,
SVM,
SelfLearning,
TSVM,
USMLeastSquaresClassifier,
WellSVM
Examples

library(RSSL)

data(svmlin_example)

# Supervised fit using only the labeled training data
t_svmlin_1 <- svmlin(svmlin_example$X_train[1:50, ],
                     svmlin_example$y_train, X_u = NULL, lambda = 0.001)

# Semi-supervised fit: remaining training objects serve as unlabeled data
t_svmlin_2 <- svmlin(svmlin_example$X_train[1:50, ],
                     svmlin_example$y_train,
                     X_u = svmlin_example$X_train[-c(1:50), ],
                     lambda = 10, lambda_u = 100, algorithm = 2)

# Calculate accuracy on the test set
mean(predict(t_svmlin_1, svmlin_example$X_test) == svmlin_example$y_test)
mean(predict(t_svmlin_2, svmlin_example$X_test) == svmlin_example$y_test)

data(testdata)

g_svm  <- SVM(testdata$X, testdata$y)
g_sup  <- svmlin(testdata$X, testdata$y, testdata$X_u, algorithm = 3)
g_semi <- svmlin(testdata$X, testdata$y, testdata$X_u, algorithm = 2)

mean(predict(g_svm, testdata$X_test) == testdata$y_test)
mean(predict(g_sup, testdata$X_test) == testdata$y_test)
mean(predict(g_semi, testdata$X_test) == testdata$y_test)