loss_contrastive: Contrastive loss

View source: R/losses.R

Description

Computes the contrastive loss between 'y_true' and 'y_pred'.

Usage

loss_contrastive(
  margin = 1,
  reduction = tf$keras$losses$Reduction$SUM_OVER_BATCH_SIZE,
  name = "contrasitve_loss"
)

Arguments

margin

Float, margin term in the loss definition. Default value is 1.0.

reduction

(Optional) Type of tf$keras$losses$Reduction to apply. Default value is SUM_OVER_BATCH_SIZE.

name

(Optional) Name for the loss.
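
For instance, the loss can be constructed with a non-default margin before being passed to compile(), as in the Examples section below. A minimal sketch, assuming tfaddons is installed; the margin value 2.0 is purely illustrative:

library(tfaddons)

# Contrastive loss with a wider margin; reduction and name keep the
# defaults shown in the Usage block above.
loss_obj <- loss_contrastive(margin = 2.0)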

Details

This loss encourages embeddings to be close to each other for samples of the same label, and to be at least the margin apart for samples of different labels. The Euclidean distances 'y_pred' between two embedding matrices 'a' and 'b' with shape [batch_size, hidden_size] can be computed as follows:

# y_pred = sqrt(sum_i (a[:, i] - b[:, i])^2)
y_pred = tf$linalg$norm(a - b, axis = 1L)

See: http://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf
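
Concretely, the per-pair loss combines the two cases above. A minimal sketch in plain R, assuming the TensorFlow Addons convention that 'y_true' is 1 for a similar pair and 0 for a dissimilar pair; the function name contrastive_loss_sketch is illustrative, not part of the package:

contrastive_loss_sketch <- function(y_true, y_pred, margin = 1) {
  # Similar pairs are penalised by their squared distance; dissimilar
  # pairs by the squared amount they fall short of the margin.
  y_true * y_pred^2 + (1 - y_true) * pmax(margin - y_pred, 0)^2
}

contrastive_loss_sketch(y_true = c(1, 0), y_pred = c(0.3, 0.8))
# 0.09 0.04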

Value

contrastive_loss: 1-D float 'Tensor' with shape [batch_size].

Examples

## Not run: 
library(keras)
library(tfaddons)

# Pass the contrastive loss to compile() like any other Keras loss.
keras_model_sequential() %>%
  layer_dense(4, input_shape = c(784)) %>%
  compile(
    optimizer = 'sgd',
    loss = loss_contrastive(),
    metrics = 'accuracy'
  )

## End(Not run)
