Description

Configure a Keras model for training.

Usage

## S3 method for class 'keras.engine.training.Model'
compile(
  object,
  optimizer,
  loss,
  metrics = NULL,
  loss_weights = NULL,
  sample_weight_mode = NULL,
  weighted_metrics = NULL,
  target_tensors = NULL,
  ...
)

Arguments
object
    Model object to compile.

optimizer
    Name of optimizer or optimizer instance.

loss
    Name of objective function or objective function. If the model has
    multiple outputs, you can use a different loss on each output by
    passing a named list or a list of objectives. The loss value that
    will be minimized by the model will then be the sum of all
    individual losses.

metrics
    List of metrics to be evaluated by the model during training and
    testing. Typically you will use metrics = c('accuracy'), as in the
    first example below.

loss_weights
    Optional list specifying scalar coefficients to weight the loss
    contributions of different model outputs. The loss value that will
    be minimized by the model will then be the weighted sum of all
    individual losses, weighted by the loss_weights coefficients (see
    the second example below).

sample_weight_mode
    If you need to do timestep-wise sample weighting (2D weights), set
    this to "temporal".

weighted_metrics
    List of metrics to be evaluated and weighted by sample_weight or
    class_weight during training and testing.

target_tensors
    By default, Keras will create a placeholder for the model's target,
    which will be fed with the target data during training. If instead
    you would like to use your own target tensor (in turn, Keras will
    not expect external data for these targets at training time), you
    can specify them via the target_tensors argument.

...
    When using the Theano/CNTK backends, these arguments are passed
    into K.function. When using the TensorFlow backend, these arguments
    are passed into tf$Session$run.
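Examples

A minimal sketch of a typical call. The model architecture and layer
sizes here are illustrative, not from this page; only the compile()
arguments are documented above.

library(keras)

# Build a small sequential model (sizes are illustrative).
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

# Compile with an optimizer name, a single loss, and a metrics list.
model %>% compile(
  optimizer = "rmsprop",
  loss = "categorical_crossentropy",
  metrics = c("accuracy")
)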
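For a multi-output model, loss can be a named list with one objective
per output, and loss_weights scales each term of the total loss. This
sketch assumes a two-output functional model whose output names
("class" and "value") are hypothetical.

library(keras)

inputs <- layer_input(shape = c(100))
shared <- inputs %>% layer_dense(units = 64, activation = "relu")
class_output <- shared %>%
  layer_dense(units = 10, activation = "softmax", name = "class")
value_output <- shared %>% layer_dense(units = 1, name = "value")
model <- keras_model(inputs = inputs,
                     outputs = list(class_output, value_output))

# One objective per output; the loss minimized during training is
# 1.0 * class loss + 0.2 * value loss.
model %>% compile(
  optimizer = optimizer_rmsprop(),
  loss = list(class = "categorical_crossentropy", value = "mse"),
  loss_weights = list(class = 1.0, value = 0.2)
)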
See Also

Other model functions: evaluate.keras.engine.training.Model(),
evaluate_generator(), fit.keras.engine.training.Model(),
fit_generator(), get_config(), get_layer(), keras_model_sequential(),
keras_model(), multi_gpu_model(), pop_layer(),
predict.keras.engine.training.Model(), predict_generator(),
predict_on_batch(), predict_proba(),
summary.keras.engine.training.Model(), train_on_batch()