compile_model: Compile model

View source: R/create_model_utils.R

compile_model R Documentation

Compile model

Description

Compile a keras model by setting the optimizer, loss function and metrics.

Usage

compile_model(
  model,
  solver,
  learning_rate,
  loss_fn,
  label_smoothing = 0,
  num_output_layers = 1,
  label_noise_matrix = NULL,
  bal_acc = FALSE,
  f1_metric = FALSE,
  auc_metric = FALSE,
  layer_dense = NULL
)

Arguments

model

A keras model.

solver

Optimization method; one of "adam", "adagrad", "rmsprop" or "sgd".

learning_rate

Learning rate for the optimizer.

loss_fn

Either "categorical_crossentropy" or "binary_crossentropy". If label_noise_matrix given, will use custom "noisy_loss".

label_smoothing

Float in [0, 1]. If 0, no smoothing is applied. If > 0, the loss is computed between the predicted labels and a smoothed version of the true labels, where the smoothing squeezes the labels towards 0.5. The closer the argument is to 1, the more the labels get smoothed.
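As a sketch of what the smoothing step does for two classes (the standard formula y * (1 - s) + s / 2; `smooth_labels` is a hypothetical helper for illustration, since deepG applies smoothing inside the loss rather than through a user-facing function):

```r
# Hypothetical helper illustrating two-class label smoothing.
# Squeezes one-hot labels towards 0.5 by the smoothing factor s.
smooth_labels <- function(y, smoothing) {
  y * (1 - smoothing) + smoothing / 2
}

y_true <- c(0, 1)                # one-hot label for class 2
smooth_labels(y_true, 0.1)       # 0.05 0.95
smooth_labels(y_true, 1)         # 0.50 0.50 (fully smoothed)
```

With smoothing = 1 both classes collapse to 0.5, which is why values close to 1 smooth the labels more.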

num_output_layers

Number of output layers.

label_noise_matrix

Matrix of label noise. Each row corresponds to one true class, and the columns give the fraction of labels observed for that class. For example, if the first class contains 5 percent wrong labels and the second class has no noise:

label_noise_matrix <- matrix(c(0.95, 0.05, 0, 1), nrow = 2, byrow = TRUE)

bal_acc

Whether to add balanced accuracy.

f1_metric

Whether to add F1 metric.

auc_metric

Whether to add AUC metric.

layer_dense

Vector specifying the number of neurons per dense layer after the last LSTM or CNN layer (if no LSTM is used).

Value

A compiled keras model.

Examples



model <- create_model_lstm_cnn(layer_lstm = 8, compile = FALSE)
model <- compile_model(model = model,
                       solver = "adam",
                       learning_rate = 0.01,
                       loss_fn = "categorical_crossentropy")
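A sketch of a binary-classification variant that also enables the optional metrics; it assumes the same deepG API as above and a model whose output layer matches "binary_crossentropy", and it needs a working keras/TensorFlow backend to run:

```r
# Sketch: compile with binary cross-entropy and extra metrics
# (assumes a model with a matching binary output layer).
model <- create_model_lstm_cnn(layer_lstm = 8, compile = FALSE)
model <- compile_model(model = model,
                       solver = "adam",
                       learning_rate = 0.001,
                       loss_fn = "binary_crossentropy",
                       f1_metric = TRUE,
                       auc_metric = TRUE)
```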


GenomeNet/deepG documentation built on Dec. 24, 2024, 12:11 p.m.