softmaxreg: Training Multi-Layer Neural Network for Softmax Regression and Classification

Implementation of 'softmax' regression and classification models with a multi-layer neural network. It can be used for many tasks, such as word-embedding-based document classification and handwritten digit recognition on the 'MNIST' dataset. Multiple optimization algorithms are provided, including 'SGD', 'Adagrad', 'RMSprop', 'Momentum', and 'NAG'.
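
As a quick orientation, the sketch below fits a classifier on R's built-in iris data. It is a minimal illustration, not an example from the package manual: the argument names (hidden, maxit, algorithm, rate, batch) are assumptions based on the version 1.2 documentation and should be checked against ?softmaxReg.

    ## Minimal sketch, not an official example; argument names are
    ## assumptions to verify against ?softmaxReg (softmaxreg 1.2).
    library(softmaxreg)

    x <- as.matrix(iris[, 1:4])   # four numeric predictors
    y <- iris[, 5]                # three-class factor response

    # One hidden layer of 5 units, trained with plain SGD
    model <- softmaxReg(x, y, hidden = c(5), maxit = 1000,
                        algorithm = "sgd", rate = 0.05, batch = 20)

    summary(model)                        # summary.softmax
    pred <- predict(model, newdata = x)   # predict.softmax

The other optimizers are presumably selected through the same algorithm argument (e.g. algorithm = "rmsprop"); the exact spelling of those values belongs to the man pages, not this sketch.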

Author: Xichen Ding <rockingdingo@gmail.com>
Date of publication: 2016-09-09 12:37:04
Maintainer: Xichen Ding <rockingdingo@gmail.com>
License: GPL (>= 2)
Version: 1.2


Functions

activate
activateDeri
addL2GradPenalty
addL2LossPenalty
addList
AIC
AIC.softmax
BIC
BIC.softmax
calcGrad
calcInfoCriteria
calcLoss
checkGradMode
convertClass2Matrix
convertList2Matrix
copyShape
divideList
divideListByList
dividePara
document
forwardProp
forwardPropParallel
getLogValue
getRandomBatch
gradientDescent
initPara
isInteger
load_image_file
load_label_file
loadURLData
MSELoss
multiplyList
multiplyListByList
predict.softmax
ReLuDeri
ReLuFunc
rootList
show_digit
sigmoid
sigmoidDeri
softmax
softmax-class
softmaxFunc
softmaxLoss
$,softmax-method
softmaxProp
softmaxReg
softmaxReg.default
softmaxReg.formula
squareList
subtractList
summary.softmax
tanhDeri
tanhFunc
trainModel
updatePara
word2vec
wordEmbed
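
Several of these functions form the MNIST workflow mentioned in the description (load_image_file, load_label_file, show_digit). The sketch below is illustrative only: the file paths are placeholders, and the assumption that load_image_file() yields one 784-pixel row per image while load_label_file() returns the matching 0-9 label vector should be verified against the individual man pages.

    ## Illustrative MNIST sketch; paths are placeholders and the return
    ## shapes of load_image_file()/load_label_file() are assumptions.
    library(softmaxreg)

    # IDX files from http://yann.lecun.com/exdb/mnist/, already unpacked
    x_train <- load_image_file("mnist/train-images-idx3-ubyte")
    y_train <- load_label_file("mnist/train-labels-idx1-ubyte")

    show_digit(x_train[1, ])   # visual sanity check on the first image

    # Small network on the raw pixels; arguments as in the iris sketch
    fit  <- softmaxReg(x_train, y_train, hidden = c(10), maxit = 50,
                       algorithm = "rmsprop", rate = 0.01, batch = 100)
    pred <- predict(fit, newdata = x_train)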
