```r
knitr::opts_chunk$set(echo = TRUE)
```
The package is designed to be easily extended. In this vignette, a custom activation function is used. Please read the README and the Deep Regression Data vignette before this one.
To simulate data for a neural network that uses a custom activation function, all that is needed is the activation function itself, written in R. It should take a single argument named x.
The usual ReLU is linear for inputs above zero and has no maximum. This custom activation is also linear above zero, but is capped at a maximum value of 10.
```r
custom_relu_R <- function(x) {
  out <- ifelse(x >= 0, x, 0)
  out <- pmin(out, 10)
  return(out)
}
```
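As a quick sanity check, the function can be evaluated on a few values: negative inputs map to zero, values between 0 and 10 pass through unchanged, and anything above 10 is capped.

```r
custom_relu_R(c(-5, 3, 15))
#> [1]  0  3 10
```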
With the custom activation defined, the remaining code is nearly identical to the README example. Only the activation changes.
```r
library(dplyr, warn.conflicts = FALSE)
library(NeuralNetworkSimulatoR)

# Weights are 1, 2, and 3
M <- list(matrix(1:3, nrow = 3, ncol = 1))
A <- list(custom_relu_R)
```
Let's create data with three continuous variables and one thousand rows.
```r
set.seed(1)
simData <- simulate_regression_data(
  rows = 1000L,
  N = 3L,
  U = 0L,
  C = 0L,
  matrices = M,
  activations = A,
  noise = 0
)
head(simData)
```
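Because noise is set to 0, the response column should be a deterministic function of the predictors. The sketch below is a manual check of that relationship; it assumes the simulator builds the response as the activation applied to X %*% W, which is an assumption about the generator's internals rather than documented behavior.

```r
# Assumption: response = custom_relu_R(X %*% W), with the predictors in
# columns 1-3 and the response in column 4 (as used in the modeling step below)
X_mat <- as.matrix(simData[, 1:3])
manual_Y <- custom_relu_R(X_mat %*% M[[1]])
all.equal(as.numeric(manual_Y), as.numeric(simData[, 4]))
```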
Using this data, let's train a model and look at the estimated weights (betas).
```r
library(keras)

X <- simData[, 1:3]
Y <- simData[, 4]

model <- keras_model_sequential() %>%
  layer_dense(
    units = 1,
    input_shape = 3,
    use_bias = FALSE,
    kernel_initializer = initializer_constant(value = 1)
  ) %>%
  layer_activation_relu(max_value = 10)

model %>% compile(
  loss = loss_mean_squared_error,
  optimizer = optimizer_rmsprop()
)

model %>% fit(
  x = X,
  y = Y,
  epochs = 30,
  batch_size = 10,
  verbose = FALSE
)

get_weights(model)
```
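As an optional follow-up (not part of the original example), the fitted model's predictions can be compared against the simulated response; since the data were generated without noise, the mean squared error should be close to zero.

```r
# Compare fitted predictions to the simulated response (illustrative check)
preds <- model %>% predict(X)
mean((preds - Y)^2)
```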
The estimated weights are very close to the true values of 1, 2, and 3.