Cross-validated linear discriminant calculations determine the optimum number of features. Test and training scores from the successive cross-validation steps are then used, via a principal components calculation, to derive a low-dimensional global space onto which test scores are projected for plotting. Further functions serve didactic purposes.
|Date of publication|2016-11-03 23:09:33|
|Maintainer|John Maindonald <email@example.com>|
|License|GPL (>= 2)|
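The sketch below illustrates the workflow described above on the bundled Golub data: cross-validated linear discriminant calculations to choose the number of features, followed by projection of the cross-validated test scores onto a common low-dimensional space for plotting. It is a minimal example only; the argument names shown (cl, nfeatures, nfold, cvlist, cl.other, scorelist) follow the package's worked examples and should be checked against ?cvdisc, ?cvscores and ?scoreplot.

```r
library(hddplot)
data(Golub)      # expression matrix: 7129 features (rows) x 72 arrays (columns)
data(golubInfo)  # classifying factors for the 72 columns of Golub

## Cross-validated linear discriminant accuracy for 1 to 16 features,
## using 10-fold cross-validation repeated 4 times
golub.cv <- cvdisc(Golub, cl = golubInfo$cancer,
                   nfeatures = 1:16, nfold = c(10, 4))

## Cross-validated test scores for a chosen number of features,
## projected onto a common low-dimensional space
golub.scores <- cvscores(cvlist = golub.cv, nfeatures = 3, cl.other = NULL)

## Plot the discriminant function scores
scoreplot(scorelist = golub.scores)
```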
accTrainTest: Two subsets of data each take in turn the role of test set
aovFbyrow: Calculate the aov F-statistic for each row of a matrix
cvdisc: Cross-validated accuracy, in linear discriminant calculations
cvscores: For high-dimensional data with known groups, derive scores...
defectiveCVdisc: Defective accuracy assessments from linear discriminant...
divideUp: Partition data into multiple nearly equal subsets
Golub: Golub data (7129 rows by 72 columns), after normalization
golubInfo: Classifying factors for the 72 columns of the Golub data set
hddplot.package: For high-dimensional data with known groups, derive scores...
orderFeatures: Order features, based on their ability to discriminate
pcp: Convenience version of the singular value decomposition
plotTrainTest: Plot predictions for both a I/II train/test split, and the...
qqthin: A version of qqplot() that thins out points that overplot
scoreplot: Plot discriminant function scores, with various...
simulateScores: Generate linear discriminant scores from random data, after...
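One of the didactic functions can be sketched in the same hedged way: simulateScores generates discriminant scores from purely random data, so that scoreplot can show how convincing, but entirely spurious, group separation appears when the discriminating features are selected from the same data that are then plotted. The argument names below follow the style of the package's worked examples; consult ?simulateScores for the definitive interface.

```r
library(hddplot)

## Scores from random (noise-only) data, for three nominal groups of
## sizes 19, 10 and 2, mimicking the dimensions of the Golub data
sim.scores <- simulateScores(nrow = 7129, cl = rep(1:3, c(19, 10, 2)))

## The apparent separation in the plot is an artefact of selecting
## discriminating features from the same random data
scoreplot(scorelist = sim.scores)
```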