kernDeepStackNet: Kernel Deep Stacking Networks

Contains functions for estimation and model selection of kernel deep stacking networks (KDSNs). Model selection can be carried out either by direct optimization or by model-based alternatives, in both cases with arbitrary loss functions.

Install the latest version of this package by entering the following in R:
install.packages("kernDeepStackNet")
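After installation, a typical workflow is to fit a KDSN with `fitKDSN` and predict with the S3 method `predict.KDSN` (both listed in the man pages below). The sketch here is illustrative only: the function names come from this index, but the argument names (`y`, `X`, `levels`, `newx`) are assumptions -- consult `?fitKDSN` and `?predict.KDSN` in the installed package for the authoritative interface.

```r
library(kernDeepStackNet)

# Simulated regression data: 100 observations, 5 covariates
set.seed(1)
X <- matrix(rnorm(100 * 5), nrow = 100, ncol = 5)
y <- X %*% c(1, -1, 0.5, 0, 0) + rnorm(100)

# Fit a one-level kernel deep stacking network with random Fourier
# features (argument names here are assumptions; see ?fitKDSN)
fit <- fitKDSN(y = y, X = X, levels = 1)

# Predict on new data via predict.KDSN (the name of the new-data
# argument is likewise an assumption; see ?predict.KDSN)
preds <- predict(fit, newx = X)
```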
Author: Thomas Welchowski <welchow@imbie.meb.uni-bonn.de> and Matthias Schmid <matthias.schmid@imbie.uni-bonn.de>
Date of publication: 2017-02-08 01:30:30
Maintainer: Thomas Welchowski <welchow@imbie.meb.uni-bonn.de>
License: GPL-3
Version: 2.0.1

View on CRAN

Man pages

calcTrA: Calculates the trace of the hat matrix

calcTrAFast: Calculates the trace of the hat matrix as C version

calcWdiag: Calculation of weight matrix

cancorRed: Calculate first canonical correlation

crossprodRcpp: Calculates the cross product of a matrix

devStandard: Predictive deviance of a linear model

EImod: Expected improvement criterion replacement function

fineTuneCvKDSN: Fine tuning of random weights of a given KDSN model

fitEnsembleKDSN: Fit an ensemble of KDSN

fitKDSN: Fit kernel deep stacking network with random Fourier...

fourierTransPredict: Prediction based on random Fourier transformation

gDerivMu: Derivative of the link function evaluated at the expected...

getEigenValuesRcpp: Calculates the eigenvalues of a matrix

kernDeepStackNet-package: Kernel deep stacking networks with random Fourier...

lossApprox: Kernel deep stacking network loss function

lossCvKDSN: Kernel deep stacking network loss function based on...

lossGCV: Generalized cross-validation loss

lossSharedCvKDSN: Kernel deep stacking network loss function based on...

lossSharedTestKDSN: Kernel deep stacking network loss function with test set and...

mbo1d: Efficient global optimization with iterative point proposals

mboAll: Efficient global optimization inclusive meta model validation

optimize1dMulti: One dimensional optimization of multivariate loss functions

predict.KDSN: Predict kernel deep stacking networks

predict.KDSNensemble: Predict kernel deep stacking networks ensembles

predict.KDSNensembleDisk: Predict kernel deep stacking networks ensembles

predLogProb: Predictive logarithmic probability of Kriging model

randomFourierTrans: Random Fourier transformation

rdcPart: Randomized dependence coefficient partial calculation

rdcSubset: Randomized dependence coefficients score on given subset

rdcVarOrder: Variable ordering using randomized dependence coefficients...

rdcVarSelSubset: Variable selection based on RDC with genetic algorithm

robustStandard: Robust standardization

tuneMboLevelCvKDSN: Tuning of KDSN with efficient global optimization given level...

tuneMboLevelGcvKDSN: Tuning of KDSN with efficient global optimization given level...

tuneMboSharedCvKDSN: Tuning of KDSN with efficient global optimization given level...

tuneMboSharedSubsetKDSN: Tuning subsets of KDSN with efficient global optimization and...

varMu: Variance function evaluated at expected value

