cce.loss | Categorical Cross-Entropy Loss function, used for multi-class classification.
create.mlp | Initialize a Neural Network. |
forward.prop | Performs forward propagation through the neural network. Most...
leaky.relu | A "leaky" ReLU function, and its derivative. The slope in the negative region is a small nonzero constant.
linear | A linear activation function, and its derivative. |
logit.loss | A Logistic Loss function, used for binary classification problems.
mse.loss | A Mean Squared Error Loss function, used for regression problems.
plot.mlp | Plot the Loss or Other Metrics of a Neural Network |
predict.mlp | Make predictions using the neural network. |
relu | A rectified linear unit or ReLU = max(0, x) function, and its derivative.
sigmoid | A sigmoid (= logistic) function, and its derivative. |
softmax | A softmax activation, used for classifier networks as the output-layer activation.
split.dataset | Split a dataset into a design (X) matrix and labels (Y), and... |
tanh | A hyperbolic tangent function, and its derivative. |
train | Train a Neural Network |
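For reference, the standard definitions behind the activation and loss entries above are given below. The package's exact conventions (for example, the leaky-ReLU slope or whether losses are averaged over the batch) may differ; consult each function's help page.

\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}

\mathrm{ReLU}(x) = \max(0, x), \qquad \mathrm{LeakyReLU}(x) = \max(\alpha x, x), \ 0 < \alpha < 1

\mathrm{softmax}(x)_i = \frac{e^{x_i}}{\sum_j e^{x_j}}

\mathrm{MSE}(y, \hat{y}) = \frac{1}{n} \sum_i (y_i - \hat{y}_i)^2, \qquad \mathrm{CCE}(y, \hat{y}) = -\sum_k y_k \log \hat{y}_k

\mathrm{LogLoss}(y, \hat{y}) = -\bigl[\, y \log \hat{y} + (1 - y) \log(1 - \hat{y}) \,\bigr]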
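Taken together, these functions support a simple workflow: split the data, create a network, train it, inspect the loss, and make predictions. A minimal sketch follows; the argument names used here (label, hidden, activation, lr, epochs) are illustrative assumptions, not the package's documented signatures, so see the individual help pages (e.g. ?create.mlp, ?train) for the real interfaces.

# Minimal usage sketch; argument names are assumptions for illustration only.
data(iris)                                        # built-in example data
sets <- split.dataset(iris, label = "Species")    # assumed: returns X and Y
net  <- create.mlp(hidden = c(8, 8),              # assumed: two hidden layers
                   activation = "relu")
net  <- train(net, sets$X, sets$Y,                # assumed: returns the fitted network
              lr = 0.01, epochs = 200)
plot.mlp(net)                                     # plot the training loss
preds <- predict.mlp(net, sets$X)                 # predictions for new inputs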