SimpleMKL.classification: Simple MKL


View source: R/SimpleMKL.Classification.R

Description

This function conducts Simple MKL classification for precomputed Gram matrices.

Usage

SimpleMKL.classification(k, outcome, penalty, tol = 10^(-4),
  max.iters = 1000)

Arguments

k

list of Gram matrices (a construction sketch follows the Arguments list)

outcome

vector of binary outcomes, coded as -1 and 1

penalty

penalty controlling the smoothness of the resulting decision rules

tol

if the change between two iterations is smaller than this value, the algorithm is considered to have converged

max.iters

maximum number of allowed iterations
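
The list passed as k can be produced with the package's kernels.gen helper (as in the Examples below) or assembled by hand. A minimal sketch of the by-hand route, assuming plain numeric Gram matrices are accepted; the feature matrix x and the kernel choices are illustrative, not taken from the package documentation:

library(kernlab)
# Illustrative feature matrix (100 samples, 2 features)
x = matrix(rnorm(100 * 2), ncol = 2)
# One Gram matrix per candidate kernel, coerced to ordinary matrices
k = list(as.matrix(kernelMatrix(rbfdot(sigma = 0.5), x)),
         as.matrix(kernelMatrix(rbfdot(sigma = 2), x)),
         as.matrix(kernelMatrix(vanilladot(), x)))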

Value

gamma weight vector giving the importance of each kernel

alpha coefficients of the dual formulation of MKL

time total amount of time needed to train the model

iters number of iterations needed to reach the convergence criterion
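
A short sketch of inspecting the fitted object, using the component names listed above and visible in the example output further down; the object name fit is illustrative and the inputs reuse the objects built in the Examples below:

fit = SimpleMKL.classification(K.train, example.data[training.samples, 3], C)
fit$gamma   # kernel weights, one per Gram matrix in K.train
fit$alpha   # dual coefficients, one per training sample
fit$b       # intercept of the decision rule
fit$iters   # iterations used to reach the convergence criterion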

Examples

library(kernlab)
library(caret)
library(RMKL)
# Load data
data(benchmark.data)
example.data = benchmark.data[[1]]
# Split samples into training and test sets
training.samples = sample(1:dim(example.data)[1], floor(0.7*dim(example.data)[1]), replace = FALSE)
# Set up cost parameter and kernels
C = 100
kernels = rep('radial', 3)
degree = rep(0, 3)
scale = rep(0, 3)
sigma = c(0, 2^seq(-3, 0))
# Generate the training and test Gram matrices for each kernel
K = kernels.gen(example.data[,1:2], training.samples, kernels, degree, scale, sigma)
K.train = K$K.train
# Fit Simple MKL on the precomputed training Gram matrices
SimpleMKL.classification(K.train, example.data[training.samples, 3], C)

Example output

Loading required package: lattice
Loading required package: ggplot2

Attaching package: 'ggplot2'

The following object is masked from 'package:kernlab':

    alpha

$gamma
[1] 0 1 0

$iters
[1] 2

$alpha
  [1] 2.145553e-03 4.408910e-05 3.305602e+01 9.999992e+01 1.510327e-04
  [6] 1.006381e+01 5.955634e+01 1.367328e-04 7.490723e-05 9.357457e+01
 [11] 3.009480e-05 9.999999e+01 3.355234e+01 4.510966e-05 4.157061e+01
 [16] 4.322098e-05 2.878140e+01 3.517140e+01 5.006819e-05 8.655379e+00
 [21] 7.704790e+01 3.096216e-03 4.199576e-04 3.663009e-04 9.485126e+01
 [26] 2.022167e-01 1.063272e-05 3.830590e+01 5.403788e+01 4.769932e-05
 [31] 4.550526e+01 2.488669e+01 4.254738e-05 9.330425e-05 1.000000e+02
 [36] 1.938196e+01 7.059709e-06 2.879869e-04 1.000000e+02 1.881237e+01
 [41] 9.999999e+01 7.290926e+01 1.368251e-05 1.833999e-03 2.513803e+00
 [46] 1.215574e+01 6.457470e+01 1.224067e-04 1.723929e+01 1.491627e+01
 [51] 1.192191e-05 4.693348e+01 8.235141e-05 9.999999e+01 9.195460e+00
 [56] 2.384703e+01 1.000000e+02 3.568320e+01 3.981720e-05 6.119435e+01
 [61] 1.099335e-04 9.335132e+01 3.283527e+01 1.831674e-05 5.352280e-05
 [66] 3.922601e+01 2.661056e+01 9.999997e+01 1.136354e-05 1.297337e-02
 [71] 3.788881e-05 2.106342e-05 1.745993e+01 8.288365e+01 1.675527e-05
 [76] 8.198266e+01 8.287029e+01 1.229353e+01 5.506263e+01 2.461231e+01
 [81] 5.173841e+00 2.725858e+01 8.698796e+01 2.131988e+01 1.363580e+01
 [86] 3.905219e-05 3.075299e+01 1.188970e-05 7.514731e-04 1.984472e+01
 [91] 3.463592e-05 1.579760e+01 9.631525e-06 1.191972e-04 1.361960e-04
 [96] 1.528748e-05 7.926626e+01 3.415773e-04 8.772297e-04 1.000000e+02
[101] 3.373321e+01 1.232445e-04 9.999999e+01 9.306351e+01 4.843026e+01
[106] 3.344064e+01 6.260597e+01 4.263392e-05 3.717509e+01 1.000000e+02
[111] 1.962077e+01 7.166229e+01 6.048994e+00 2.966697e-05 3.720834e-05
[116] 3.955233e-04 2.736186e+01 1.848997e-05 3.661057e-05 6.439493e+01
[121] 2.902792e+01 8.906003e-06 9.999999e+01 3.642794e+01 1.666375e-04
[126] 7.206920e+01 4.882937e+01 1.023735e+01 2.965309e+01 3.079292e+01
[131] 1.740635e-05 2.116364e-04 9.999998e+01 3.207111e+01 2.579153e+01
[136] 8.814107e-06 6.054023e-05 1.096759e-04 8.122470e+01 4.045217e+01

$b
[1] 0.03714253

$gamma_all
$gamma_all[[1]]
[1] 0.3333333 0.3333333 0.3333333

$gamma_all[[2]]
[1] 0.0000000 0.5394908 0.4605092

$gamma_all[[3]]
[1] 0 1 0

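In MKL, the learned decision rule operates on the gamma-weighted sum of the supplied Gram matrices, so the gamma of (0, 1, 0) above means only the second kernel contributes. A hedged sketch of forming that combined training kernel from the fitted weights; it reuses the objects from the example above and assumes each list entry behaves as an ordinary matrix:

fit = SimpleMKL.classification(K.train, example.data[training.samples, 3], C)
# Weighted sum of the training Gram matrices: sum_m gamma_m * K_m
K.combined = Reduce(`+`, Map(function(w, K) w * K, fit$gamma, K.train))
dim(K.combined)   # n.train x n.train combined Gram matrix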