These models are included in the package via wrappers for train. Custom models can also be created. See the URL below.
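Each entry below gives the descriptive label, the method value to pass to train, the package(s) the wrapper calls, and the names of any tuning parameters (these names are the column names a tuneGrid data frame would use). As a minimal, hedged sketch of the common calling pattern only (the data set and resampling settings are arbitrary illustrative choices, not part of this listing):

    library(caret)

    data(iris)
    set.seed(1)
    ## Any method string from the list below can be substituted here; for
    ## methods with tuning parameters, an explicit grid can be supplied via
    ## tuneGrid, or tuneLength can be used to request a default grid.
    fit <- train(Species ~ ., data = iris,
                 method = "lda",   # an identifier from the list below
                 trControl = trainControl(method = "cv", number = 5))
    fit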
Bagged CART (method = 'treebag')
For classification and regression using packages ipred and plyr with no tuning parameters
Bagged Flexible Discriminant Analysis (method = 'bagFDA')
For classification using packages earth and mda with tuning parameters:
Product Degree (degree)
Number of Terms (nprune)
Bagged Logic Regression (method = 'logicBag')
For classification and regression using package logicFS with tuning parameters:
Maximum Number of Leaves (nleaves)
Number of Trees (ntrees)
Bagged MARS (method = 'bagEarth')
For classification and regression using package earth with tuning parameters:
Number of Terms (nprune)
Product Degree (degree)
Bagged Model (method = 'bag')
For classification and regression using package caret with tuning parameters:
Number of Randomly Selected Predictors (vars)
Bayesian Generalized Linear Model (method = 'bayesglm')
For classification and regression using package arm with no tuning parameters
Bayesian Regularized Neural Networks (method = 'brnn')
For regression using package brnn with tuning parameters:
Number of Neurons (neurons)
Boosted Classification Trees (method = 'ada')
For classification using package ada with tuning parameters:
Number of Trees (iter)
Max Tree Depth (maxdepth)
Learning Rate (nu)
Boosted Generalized Additive Model (method = 'gamboost')
For classification and regression using package mboost with tuning parameters:
Number of Boosting Iterations (mstop)
AIC Prune? (prune)
Boosted Generalized Linear Model (method = 'glmboost')
For classification and regression using package mboost with tuning parameters:
Number of Boosting Iterations (mstop)
AIC Prune? (prune)
Boosted Linear Model (method = 'bstLs')
For classification and regression using packages bst and plyr with tuning parameters:
Number of Boosting Iterations (mstop)
Shrinkage (nu)
Boosted Logistic Regression (method = 'LogitBoost')
For classification using package caTools with tuning parameters:
Number of Boosting Iterations (nIter)
Boosted Smoothing Spline (method = 'bstSm')
For classification and regression using packages bst and plyr with tuning parameters:
Number of Boosting Iterations (mstop)
Shrinkage (nu)
Boosted Tree (method = 'blackboost')
For classification and regression using packages party, mboost and plyr with tuning parameters:
Number of Trees (mstop)
Max Tree Depth (maxdepth)
Boosted Tree (method = 'bstTree')
For classification and regression using packages bst and plyr with tuning parameters:
Number of Boosting Iterations (mstop)
Max Tree Depth (maxdepth)
Shrinkage (nu)
C4.5-like Trees (method = 'J48')
For classification using package RWeka with tuning parameters:
Confidence Threshold (C)
C5.0 (method = 'C5.0')
For classification using packages C50 and plyr with tuning parameters:
Number of Boosting Iterations (trials)
Model Type (model)
Winnow (winnow)
CART (method = 'rpart')
For classification and regression using package rpart with tuning parameters:
Complexity Parameter (cp)
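As an illustration of how the cp parameter above might be tuned (the candidate values, data set, and 10-fold cross-validation are assumptions made for this sketch):

    set.seed(10)
    ## The grid column name must match the parameter name listed above (cp).
    rpartFit <- train(Species ~ ., data = iris,
                      method = "rpart",
                      tuneGrid = data.frame(cp = c(0.001, 0.01, 0.05, 0.1)),
                      trControl = trainControl(method = "cv", number = 10))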
CART (method = 'rpart2')
For classification and regression using package rpart with tuning parameters:
Max Tree Depth (maxdepth)
Conditional Inference Random Forest (method = 'cforest')
For classification and regression using package party with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Conditional Inference Tree (method = 'ctree')
For classification and regression using package party with tuning parameters:
1 - P-Value Threshold (mincriterion)
Conditional Inference Tree (method = 'ctree2')
For classification and regression using package party with tuning parameters:
Max Tree Depth (maxdepth)
Cost-Sensitive C5.0 (method = 'C5.0Cost')
For classification using packages C50 and plyr with tuning parameters:
Number of Boosting Iterations (trials)
Model Type (model)
Winnow (winnow)
Cost (cost)
Cost-Sensitive CART (method = 'rpartCost')
For classification using package rpart with tuning parameters:
Complexity Parameter (cp)
Cost (Cost)
Cubist (method = 'cubist')
For regression using package Cubist with tuning parameters:
Number of Committees (committees)
Number of Instances (neighbors)
Elasticnet (method = 'enet')
For regression using package elasticnet with tuning parameters:
Fraction of Full Solution (fraction)
Weight Decay (lambda)
Extreme Learning Machine (method = 'elm')
For classification and regression using package elmNN with tuning parameters:
Number of Hidden Units (nhid)
Activation Function (actfun)
Factor-Based Linear Discriminant Analysis (method = 'RFlda')
For classification using package HiDimDA with tuning parameters:
Number of Factors (q)
Flexible Discriminant Analysis (method = 'fda')
For classification using packages earth and mda with tuning parameters:
Product Degree (degree)
Number of Terms (nprune)
Gaussian Process (method = 'gaussprLinear')
For classification and regression using package kernlab with no tuning parameters
Gaussian Process with Polynomial Kernel (method = 'gaussprPoly')
For classification and regression using package kernlab with tuning parameters:
Polynomial Degree (degree)
Scale (scale)
Gaussian Process with Radial Basis Function Kernel (method = 'gaussprRadial')
For classification and regression using package kernlab with tuning parameters:
Sigma (sigma)
Generalized Additive Model using LOESS (method = 'gamLoess')
For classification and regression using package gam with tuning parameters:
Span (span)
Degree (degree)
Generalized Additive Model using Splines (method = 'gam')
For classification and regression using package mgcv with tuning parameters:
Feature Selection (select)
Method (method)
Generalized Additive Model using Splines (method = 'gamSpline')
For classification and regression using package gam with tuning parameters:
Degrees of Freedom (df)
Generalized Linear Model (method = 'glm')
For classification and regression with no tuning parameters
Generalized Linear Model with Stepwise Feature Selection (method = 'glmStepAIC')
For classification and regression using package MASS with no tuning parameters
Generalized Partial Least Squares (method = 'gpls')
For classification using package gpls with tuning parameters:
Number of Components (K.prov)
glmnet (method = 'glmnet')
For classification and regression using package glmnet with tuning parameters:
Mixing Percentage (alpha)
Regularization Parameter (lambda)
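A sketch of crossing the two glmnet parameters above with expand.grid (the data set and candidate values are illustrative assumptions; the glmnet package must be installed):

    set.seed(2)
    ## Grid column names match the parameter names listed above (alpha, lambda).
    glmnetFit <- train(mpg ~ ., data = mtcars,
                       method = "glmnet",
                       tuneGrid = expand.grid(alpha = c(0, 0.5, 1),
                                              lambda = 10^seq(-3, 0, length.out = 5)),
                       trControl = trainControl(method = "cv", number = 5))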
Greedy Prototype Selection (method = 'protoclass')
For classification using packages proxy and protoclass with tuning parameters:
Ball Size (eps)
Distance Order (Minkowski)
Heteroscedastic Discriminant Analysis (method = 'hda')
For classification using package hda with tuning parameters:
Gamma (gamma)
Lambda (lambda)
Dimension of the Discriminative Subspace (newdim)
High Dimensional Discriminant Analysis (method = 'hdda')
For classification using package HDclassif with tuning parameters:
Threshold (threshold)
Model Type (model)
Independent Component Regression (method = 'icr')
For regression using package fastICA with tuning parameters:
Number of Components (n.comp)
k-Nearest Neighbors (method = 'kknn')
For classification and regression using package kknn with tuning parameters:
Max. Number of Neighbors (kmax)
Distance (distance)
Kernel (kernel)
k-Nearest Neighbors (method = 'knn')
For classification and regression with tuning parameters:
Number of Neighbors (k)
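A sketch of tuning the single k parameter above for a regression task (the knn wrapper needs no extra package; the data set, the centering/scaling preprocessing, and the candidate k values are illustrative assumptions):

    set.seed(3)
    knnFit <- train(mpg ~ ., data = mtcars,
                    method = "knn",
                    preProcess = c("center", "scale"),
                    tuneGrid = data.frame(k = seq(1, 15, by = 2)),
                    trControl = trainControl(method = "cv", number = 5))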
Learning Vector Quantization (method = 'lvq')
For classification using package class with tuning parameters:
Codebook Size (size)
Number of Prototypes (k)
Least Angle Regression (method = 'lars')
For regression using package lars with tuning parameters:
Fraction (fraction)
Least Angle Regression (method = 'lars2')
For regression using package lars with tuning parameters:
Number of Steps (step)
Least Squares Support Vector Machine (method = 'lssvmLinear')
For classification using package kernlab with no tuning parameters
Least Squares Support Vector Machine with Polynomial Kernel (method = 'lssvmPoly')
For classification using package kernlab with tuning parameters:
Polynomial Degree (degree)
Scale (scale)
Least Squares Support Vector Machine with Radial Basis Function Kernel (method = 'lssvmRadial')
For classification using package kernlab with tuning parameters:
Sigma (sigma)
Linear Discriminant Analysis (method = 'lda')
For classification using package MASS with no tuning parameters
Linear Discriminant Analysis (method = 'lda2')
For classification using package MASS with tuning parameters:
Number of Discriminant Functions (dimen)
Linear Discriminant Analysis with Stepwise Feature Selection (method = 'stepLDA')
For classification using packages klaR and MASS with tuning parameters:
Maximum Number of Variables (maxvar)
Search Direction (direction)
Linear Regression (method = 'lm')
For regression with no tuning parameters
Linear Regression with Backwards Selection (method = 'leapBackward')
For regression using package leaps with tuning parameters:
Maximum Number of Predictors (nvmax)
Linear Regression with Forward Selection (method = 'leapForward')
For regression using package leaps with tuning parameters:
Maximum Number of Predictors (nvmax)
Linear Regression with Stepwise Selection (method = 'leapSeq')
For regression using package leaps with tuning parameters:
Maximum Number of Predictors (nvmax)
Linear Regression with Stepwise Selection (method = 'lmStepAIC')
For regression using package MASS with no tuning parameters
Logic Regression (method = 'logreg')
For classification and regression using package LogicReg with tuning parameters:
Maximum Number of Leaves (treesize)
Number of Trees (ntrees)
Logistic Model Trees (method = 'LMT')
For classification using package RWeka with tuning parameters:
Number of Iterations (iter)
Maximum Uncertainty Linear Discriminant Analysis (method = 'Mlda')
For classification using package HiDimDA with no tuning parameters
Mixture Discriminant Analysis (method = 'mda')
For classification using package mda with tuning parameters:
Number of Subclasses Per Class (subclasses)
Model Averaged Neural Network (method = 'avNNet')
For classification and regression using package nnet with tuning parameters:
Number of Hidden Units (size)
Weight Decay (decay)
Bagging (bag)
Model Rules (method = 'M5Rules')
For regression using package RWeka with tuning parameters:
Pruned (pruned)
Smoothed (smoothed)
Model Tree (method = 'M5')
For regression using package RWeka with tuning parameters:
Pruned (pruned)
Smoothed (smoothed)
Rules (rules)
Multi-Layer Perceptron (method = 'mlp')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units (size)
Multi-Layer Perceptron (method = 'mlpWeightDecay')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units (size)
Weight Decay (decay)
Multivariate Adaptive Regression Spline (method = 'earth')
For classification and regression using package earth with tuning parameters:
Number of Terms (nprune)
Product Degree (degree)
Multivariate Adaptive Regression Splines (method = 'gcvEarth')
For classification and regression using package earth with tuning parameters:
Product Degree (degree)
Naive Bayes (method = 'nb')
For classification using package klaR with tuning parameters:
Laplace Correction (fL)
Distribution Type (usekernel)
Nearest Shrunken Centroids (method = 'pam')
For classification using package pamr with tuning parameters:
Shrinkage Threshold (threshold)
Neural Network (method = 'neuralnet')
For regression using package neuralnet with tuning parameters:
Number of Hidden Units in Layer 1 (layer1)
Number of Hidden Units in Layer 2 (layer2)
Number of Hidden Units in Layer 3 (layer3)
Neural Network (method = 'nnet')
For classification and regression using package nnet with tuning parameters:
Number of Hidden Units (size)
Weight Decay (decay)
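A sketch of crossing the size and decay parameters above (the grid values and data set are illustrative assumptions; trace = FALSE is simply passed through to nnet() to silence its fitting output):

    set.seed(4)
    nnetFit <- train(Species ~ ., data = iris,
                     method = "nnet",
                     tuneGrid = expand.grid(size = c(1, 3, 5),
                                            decay = c(0, 0.01, 0.1)),
                     trace = FALSE,
                     trControl = trainControl(method = "cv", number = 5))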
Neural Networks with Feature Extraction (method = 'pcaNNet')
For classification and regression using package nnet with tuning parameters:
Number of Hidden Units (size)
Weight Decay (decay)
Oblique Random Forest (method = 'ORFlog')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Oblique Random Forest (method = 'ORFpls')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Oblique Random Forest (method = 'ORFridge')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Oblique Random Forest (method = 'ORFsvm')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Oblique Trees (method = 'oblique.tree')
For classification using package oblique.tree with tuning parameters:
Oblique Splits (oblique.splits)
Variable Selection Method (variable.selection)
Parallel Random Forest (method = 'parRF')
For classification and regression using package randomForest with tuning parameters:
Number of Randomly Selected Predictors (mtry)
partDSA (method = 'partDSA')
For classification and regression using package partDSA with tuning parameters:
Number of Terminal Partitions (cut.off.growth)
Minimum Percent Difference (MPD)
Partial Least Squares (method = 'kernelpls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp)
Partial Least Squares (method = 'pls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp)
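A sketch of tuning ncomp, the single parameter shared by this and the other Partial Least Squares wrappers (the data set, the centering/scaling preprocessing, and the candidate values are illustrative assumptions; the pls package must be installed):

    set.seed(5)
    plsFit <- train(mpg ~ ., data = mtcars,
                    method = "pls",
                    preProcess = c("center", "scale"),
                    tuneGrid = data.frame(ncomp = 1:5),
                    trControl = trainControl(method = "cv", number = 5))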
Partial Least Squares (method = 'simpls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp)
Partial Least Squares (method = 'widekernelpls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp)
Penalized Discriminant Analysis (method = 'pda')
For classification using package mda with tuning parameters:
Shrinkage Penalty Coefficient (lambda)
Penalized Discriminant Analysis (method = 'pda2')
For classification using package mda with tuning parameters:
Degrees of Freedom (df)
Penalized Linear Discriminant Analysis (method = 'PenalizedLDA')
For classification using packages penalizedLDA and plyr with tuning parameters:
L1 Penalty (lambda)
Number of Discriminant Functions (K)
Penalized Linear Regression (method = 'penalized')
For regression using package penalized with tuning parameters:
L1 Penalty (lambda1)
L2 Penalty (lambda2)
Penalized Logistic Regression (method = 'plr')
For classification using package stepPlr with tuning parameters:
L2 Penalty (lambda)
Complexity Parameter (cp)
Penalized Multinomial Regression (method = 'multinom')
For classification using package nnet with tuning parameters:
Weight Decay (decay)
Polynomial Kernel Regularized Least Squares (method = 'krlsPoly')
For regression using package KRLS with tuning parameters:
Regularization Parameter (lambda)
Polynomial Degree (degree)
Principal Component Analysis (method = 'pcr')
For regression using package pls with tuning parameters:
Number of Components (ncomp)
Projection Pursuit Regression (method = 'ppr')
For regression with tuning parameters:
Number of Terms (nterms)
Quadratic Discriminant Analysis (method = 'qda')
For classification using package MASS with no tuning parameters
Quadratic Discriminant Analysis with Stepwise Feature Selection (method = 'stepQDA')
For classification using packages klaR and MASS with tuning parameters:
Maximum Number of Variables (maxvar)
Search Direction (direction)
Quantile Random Forest (method = 'qrf')
For regression using package quantregForest with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Quantile Regression Neural Network (method = 'qrnn')
For regression using package qrnn with tuning parameters:
Number of Hidden Units (n.hidden)
Weight Decay (penalty)
Bagged Models? (bag)
Radial Basis Function Kernel Regularized Least Squares (method = 'krlsRadial')
For regression using packages KRLS and kernlab with tuning parameters:
Regularization Parameter (lambda)
Sigma (sigma)
Radial Basis Function Network (method = 'rbf')
For classification using package RSNNS with tuning parameters:
Number of Hidden Units (size)
Radial Basis Function Network (method = 'rbfDDA')
For classification and regression using package RSNNS with tuning parameters:
Activation Limit for Conflicting Classes (negativeThreshold)
Random Ferns (method = 'rFerns')
For classification using package rFerns with tuning parameters:
Fern Depth (depth)
Random Forest (method = 'rf')
For classification and regression using package randomForest with tuning parameters:
Number of Randomly Selected Predictors (mtry)
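A sketch of tuning the mtry parameter above (the candidate values and data set are illustrative assumptions, with mtry kept no larger than the number of predictors; the randomForest package must be installed):

    set.seed(6)
    rfFit <- train(mpg ~ ., data = mtcars,
                   method = "rf",
                   tuneGrid = data.frame(mtry = c(2, 4, 6, 8, 10)),
                   trControl = trainControl(method = "cv", number = 5))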
Random Forest by Randomization (method = 'extraTrees')
For classification and regression using package extraTrees with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Number of Random Cuts (numRandomCuts)
Random Forest with Additional Feature Selection (method = 'Boruta')
For classification and regression using packages Boruta and randomForest with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Random k-Nearest Neighbors (method = 'rknn')
For classification and regression using package rknn with tuning parameters:
Number of Neighbors (k)
Number of Randomly Selected Predictors (mtry)
Random k-Nearest Neighbors with Feature Selection (method = 'rknnBel')
For classification and regression using packages rknn and plyr with tuning parameters:
Number of Neighbors (k)
Number of Randomly Selected Predictors (mtry)
Number of Features Dropped (d)
Regularized Discriminant Analysis (method = 'rda')
For classification using package klaR with tuning parameters:
Gamma (gamma)
Lambda (lambda)
Regularized Random Forest (method = 'RRF')
For classification and regression using packages randomForest and RRF with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Regularization Value (coefReg)
Importance Coefficient (coefImp)
Regularized Random Forest (method = 'RRFglobal')
For classification and regression using package RRF with tuning parameters:
Number of Randomly Selected Predictors (mtry)
Regularization Value (coefReg)
Relaxed Lasso (method = 'relaxo')
For regression using packages relaxo and plyr with tuning parameters:
Penalty Parameter (lambda)
Relaxation Parameter (phi)
Relevance Vector Machines with Linear Kernel (method = 'rvmLinear')
For regression using package kernlab with no tuning parameters
Relevance Vector Machines with Polynomial Kernel (method = 'rvmPoly')
For regression using package kernlab with tuning parameters:
Scale (scale)
Polynomial Degree (degree)
Relevance Vector Machines with Radial Basis Function Kernel (method = 'rvmRadial')
For regression using package kernlab with tuning parameters:
Sigma (sigma)
Ridge Regression (method = 'ridge')
For regression using package elasticnet with tuning parameters:
Weight Decay (lambda)
Ridge Regression with Variable Selection (method = 'foba')
For regression using package foba with tuning parameters:
Number of Variables Retained (k)
L2 Penalty (lambda)
Robust Linear Discriminant Analysis (method = 'Linda')
For classification using package rrcov with no tuning parameters
Robust Linear Model (method = 'rlm')
For regression using package MASS with no tuning parameters
Robust Quadratic Discriminant Analysis (method = 'QdaCov')
For classification using package rrcov with no tuning parameters
Robust Regularized Linear Discriminant Analysis (method = 'rrlda')
For classification using package rrlda with tuning parameters:
Penalty Parameter (lambda)
Robustness Parameter (hp)
Penalty Type (penalty)
Robust SIMCA (method = 'RSimca')
For classification using package rrcovHD with no tuning parameters
ROC-Based Classifier (method = 'rocc')
For classification using package rocc with tuning parameters:
Number of Variables Retained (xgenes)
Rule-Based Classifier (method = 'JRip')
For classification using package RWeka with tuning parameters:
Number of Optimizations (NumOpt)
Rule-Based Classifier (method = 'PART')
For classification using package RWeka with tuning parameters:
Confidence Threshold (threshold)
Pruning (pruned)
Self-Organizing Map (method = 'bdk')
For classification and regression using package kohonen with tuning parameters:
Row (xdim)
Columns (ydim)
X Weight (xweight)
Topology (topo)
Self-Organizing Maps (method = 'xyf')
For classification and regression using package kohonen with tuning parameters:
Row (xdim)
Columns (ydim)
X Weight (xweight)
Topology (topo)
Shrinkage Discriminant Analysis (method = 'sda')
For classification using package sda with tuning parameters:
Diagonalize (diagonal)
Shrinkage (lambda)
SIMCA (method = 'CSimca')
For classification using package rrcovHD with no tuning parameters
Single C5.0 Ruleset (method = 'C5.0Rules')
For classification using package C50 with no tuning parameters
Single C5.0 Tree (method = 'C5.0Tree')
For classification using package C50 with no tuning parameters
Single Rule Classification (method = 'OneR')
For classification using package RWeka with no tuning parameters
Sparse Linear Discriminant Analysis (method = 'sparseLDA')
For classification using package sparseLDA with tuning parameters:
Number of Predictors (NumVars)
Lambda (lambda)
Sparse Mixture Discriminant Analysis (method = 'smda')
For classification using package sparseLDA with tuning parameters:
Number of Predictors (NumVars)
Lambda (lambda)
Number of Subclasses (R)
Sparse Partial Least Squares (method = 'spls')
For classification and regression using package spls with tuning parameters:
Number of Components (K)
Threshold (eta)
Kappa (kappa)
Stabilized Linear Discriminant Analysis (method = 'slda')
For classification using package ipred with no tuning parameters
Stacked AutoEncoder Deep Neural Network (method = 'dnn')
For classification and regression using package deepnet with tuning parameters:
Hidden Layer 1 (layer1)
Hidden Layer 2 (layer2)
Hidden Layer 3 (layer3)
Hidden Dropouts (hidden_dropout)
Visible Dropout (visible_dropout)
Stepwise Diagonal Linear Discriminant Analysis (method = 'sddaLDA')
For classification using package SDDA with no tuning parameters
Stepwise Diagonal Quadratic Discriminant Analysis (method = 'sddaQDA')
For classification using package SDDA with no tuning parameters
Stochastic Gradient Boosting (method = 'gbm')
For classification and regression using packages gbm and plyr with tuning parameters:
Number of Boosting Iterations (n.trees)
Max Tree Depth (interaction.depth)
Shrinkage (shrinkage)
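A sketch of crossing the three gbm parameters above (the grid values and data set are illustrative assumptions; the gbm and plyr packages must be installed, and more recent caret/gbm versions also expect an n.minobsinnode column in the grid):

    set.seed(7)
    gbmGrid <- expand.grid(n.trees = c(100, 500, 1000),
                           interaction.depth = c(1, 3, 5),
                           shrinkage = 0.01)
    gbmFit <- train(mpg ~ ., data = mtcars,
                    method = "gbm",
                    tuneGrid = gbmGrid,
                    verbose = FALSE,   # passed through to gbm()
                    trControl = trainControl(method = "cv", number = 5))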
Supervised Principal Component Analysis (method = 'superpc')
For regression using package superpc with tuning parameters:
Threshold (threshold)
Number of Components (n.components)
Support Vector Machines with Class Weights (method = 'svmRadialWeights')
For classification using package kernlab with tuning parameters:
Sigma (sigma)
Cost (C)
Weight (Weight)
Support Vector Machines with Linear Kernel (method = 'svmLinear')
For classification and regression using package kernlab with tuning parameters:
Cost (C)
Support Vector Machines with Polynomial Kernel (method = 'svmPoly')
For classification and regression using package kernlab with tuning parameters:
Polynomial Degree (degree)
Scale (scale)
Cost (C)
Support Vector Machines with Radial Basis Function Kernel (method = 'svmRadial')
For classification and regression using package kernlab with tuning parameters:
Sigma (sigma)
Cost (C)
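A sketch of tuning sigma and C above (the data set, preprocessing, and candidate values are illustrative assumptions; the kernlab package must be installed):

    set.seed(8)
    svmFit <- train(Species ~ ., data = iris,
                    method = "svmRadial",
                    preProcess = c("center", "scale"),
                    tuneGrid = expand.grid(sigma = c(0.01, 0.05, 0.1),
                                           C = 2^(-2:4)),
                    trControl = trainControl(method = "cv", number = 5))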
Support Vector Machines with Radial Basis Function Kernel (method = 'svmRadialCost')
For classification and regression using package kernlab with tuning parameters:
Cost (C)
The lasso (method = 'lasso')
For regression using package elasticnet with tuning parameters:
Fraction of Full Solution (fraction)
Tree Models from Genetic Algorithms (method = 'evtree')
For classification and regression using package evtree with tuning parameters:
Complexity Parameter (alpha)
Tree-Based Ensembles (method = 'nodeHarvest')
For classification and regression using package nodeHarvest with tuning parameters:
Maximum Interaction Depth (maxinter)
Prediction Mode (mode)
Variational Bayesian Multinomial Probit Regression (method = 'vbmpRadial')
For classification using package vbmp with tuning parameters:
Theta Estimated (estimateTheta)
“Using your own model in train” (http://caret.r-forge.r-project.org/custom_models.html)
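The page above describes the list structure (parameters, grid, fit, predict, and related components) that a custom model passed to train must provide. Assuming a caret version that exports getModelInfo(), one hedged starting point is to inspect a built-in wrapper and adapt it:

    library(caret)
    ## Returns the internal definition of the 'rpart' wrapper, whose structure
    ## a custom model list supplied to train(method = <list>) mirrors.
    str(getModelInfo("rpart", regex = FALSE), max.level = 2)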