Description
This operation computes the cross entropy between target_vector and the softmax of output_vector. The elements of target_vector must be non-negative and should sum to 1. The output_vector can contain any values; the function internally computes the softmax of output_vector before taking the cross entropy.
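
To make the computation concrete, the following base-R sketch reproduces what the operation does numerically (softmax over output_vector, then cross entropy against target_vector). It is only an illustration of the math, not a call into the CNTK graph API, and the sample numbers are chosen for illustration.

  # Illustration only: plain base R, not the CNTK API.
  softmax <- function(x) {
    e <- exp(x - max(x))   # subtract the max for numerical stability
    e / sum(e)
  }

  output_vector <- c(1, 1, 1, 50)   # unscaled network outputs (logits)
  target_vector <- c(0, 0, 0, 1)    # one-hot target: non-negative, sums to 1

  p <- softmax(output_vector)
  loss <- -sum(target_vector * log(p))
  loss   # approximately 0, since softmax puts almost all mass on the correct class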
Usage

loss_cross_entropy_with_softmax(output_vector, target_vector, axis = -1,
  name = "")
Arguments

output_vector   unscaled computed output values from the network
target_vector   one-hot encoded vector of target values
axis            integer (optional); the axis along which to compute the cross entropy
name            string (optional); the name of the Function instance in the network
References

https://www.cntk.ai/pythondocs/cntk.losses.html#cntk.losses.cross_entropy_with_softmax
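
A minimal usage sketch with the CNTK R bindings might look like the following. The helper names op_input_variable, op_parameter, and op_times are assumed here to mirror the Python cntk.ops API; treat them, and the package name passed to library(), as assumptions rather than authoritative API.

  library(CNTK)   # assumed package name for the CNTK R bindings

  # Assumed helpers mirroring cntk.ops: op_input_variable, op_parameter, op_times.
  features <- op_input_variable(c(4))   # 4-dimensional input sample
  labels   <- op_input_variable(c(3))   # one-hot target over 3 classes

  W <- op_parameter(c(4, 3))            # a single linear layer as a stand-in model
  z <- op_times(features, W)            # unscaled outputs (logits)

  # Cross entropy of labels against softmax(z), reduced over the last axis.
  loss <- loss_cross_entropy_with_softmax(z, labels, axis = -1, name = "ce_loss")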