Get a matrix of pairwise distances between the embeddings

Return type

`Tensor`
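As a sketch of what such a pairwise-distance helper computes, here is a NumPy version of the squared-euclidean case (the function name and the use of NumPy are illustrative assumptions, not the library's actual implementation):

```python
import numpy as np

def pairwise_sqeuclidean(embeddings):
    # embeddings: [N, d] array; returns an [N, N] matrix of squared
    # euclidean distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq_norms = (embeddings ** 2).sum(axis=1)
    dists = sq_norms[:, None] + sq_norms[None, :] - 2 * embeddings @ embeddings.T
    # clamp tiny negative values caused by floating-point rounding
    return np.maximum(dists, 0.0)

emb = np.array([[0.0, 0.0], [3.0, 4.0]])
print(pairwise_sqeuclidean(emb))  # off-diagonal entries are 25.0
```

The same `[N, N]` layout works for cosine or plain euclidean distance; only the per-pair formula changes.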

Bases: `paddle.nn.Layer`, `finetuner.tuner.base.BaseLoss`[`paddle.Tensor`]

Base class for all paddle losses.

forward(embeddings, labels)[source]#
Return type

`Tensor`

Computes the loss for a siamese network.

The loss for a pair of objects equals

```
is_sim * dist + (1 - is_sim) * max(0, margin - dist)
```

where `is_sim` equals 1 if the two objects are similar, and 0 if they are not similar. The `dist` refers to the distance between the two objects, and `margin` is a number to help bound the loss for dissimilar objects.

The final loss is the average over losses for all pairs given by the indices.
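The per-pair formula and the final averaging above can be sketched in NumPy (illustrative names and data; not the library's implementation):

```python
import numpy as np

def siamese_loss(dists, is_sim, margin=1.0):
    # dists: [P] distances for P pairs; is_sim: [P] array of 1/0 flags.
    # Similar pairs are penalized by their distance; dissimilar pairs
    # are penalized only when closer than the margin.
    per_pair = is_sim * dists + (1 - is_sim) * np.maximum(0.0, margin - dists)
    return per_pair.mean()

# a similar pair at distance 0.3 and a dissimilar pair at distance 0.2
loss = siamese_loss(np.array([0.3, 0.2]), np.array([1, 0]), margin=1.0)
print(loss)  # (0.3 + max(0, 1.0 - 0.2)) / 2 = 0.55
```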

Initialize the loss instance

Parameters
• distance (`str`) – The type of distance to use, available options are `"cosine"`, `"euclidean"` and `"sqeuclidean"`

• margin (`float`) – The margin to use in loss calculation

compute(embeddings, indices)[source]#

Compute the loss

Parameters
• embeddings (`Tensor`) – An `[N, d]` tensor of embeddings

• indices (`Tuple`[`Tensor`, `Tensor`, `Tensor`]) – A tuple of three tensors, holding for each pair the index of the first object, the index of the second object, and their similarity target (which equals 1 if the two objects are similar, and 0 if they are dissimilar)

Return type

`Tensor`
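A self-contained NumPy sketch of how `compute` could consume such an index tuple (hypothetical data; the index layout follows the description above, and the code is an assumption about the math, not the library source):

```python
import numpy as np

# three embeddings of dimension 2
emb = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0]])

# index tuple: first objects, second objects, similarity targets
ind_a, ind_b = np.array([0, 0]), np.array([1, 2])
target = np.array([1, 0])  # pair (0, 1) similar, pair (0, 2) dissimilar

# squared euclidean distances for the selected pairs: [1.0, 4.0]
dists = ((emb[ind_a] - emb[ind_b]) ** 2).sum(axis=1)

margin = 1.0
loss = (target * dists + (1 - target) * np.maximum(0.0, margin - dists)).mean()
print(loss)  # (1.0 + max(0, 1.0 - 4.0)) / 2 = 0.5
```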

get_default_miner(is_session_dataset)[source]#

Get the default miner for this loss, given the dataset type

Return type

`Union`[`SiameseMiner`, `SiameseSessionMiner`]

Compute the loss for a triplet network.

The loss for a single triplet equals:

```
max(dist_pos - dist_neg + margin, 0)
```

where `dist_pos` is the distance between the anchor embedding and positive embedding, `dist_neg` is the distance between the anchor and negative embedding, and `margin` represents a wedge between the desired anchor-negative and anchor-positive distances.

The final loss is the average over losses for all triplets given by the indices.
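The triplet formula and the final averaging can be sketched as follows (illustrative NumPy, not the library code):

```python
import numpy as np

def triplet_loss(dist_pos, dist_neg, margin=1.0):
    # per-triplet hinge: max(dist_pos - dist_neg + margin, 0),
    # averaged over all triplets; the loss is zero once the negative
    # is at least `margin` farther from the anchor than the positive
    return np.maximum(dist_pos - dist_neg + margin, 0.0).mean()

# triplet 1 already satisfies the margin; triplet 2 does not
loss = triplet_loss(np.array([0.2, 1.5]), np.array([0.9, 1.0]), margin=0.5)
print(loss)  # (0 + 1.0) / 2 = 0.5
```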

Initialize the loss instance

Parameters
• distance (`str`) – The type of distance to use, available options are `"cosine"`, `"euclidean"` and `"sqeuclidean"`

• margin (`float`) – The margin to use in loss calculation

compute(embeddings, indices)[source]#

Compute the loss

Parameters
• embeddings (`Tensor`) – An `[N, d]` tensor of embeddings

• indices (`Tuple`[`Tensor`, `Tensor`, `Tensor`]) – A tuple of three tensors, holding for each triplet the indices of the anchor, the positive match and the negative match in the embeddings tensor

Return type

`Tensor`
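A NumPy sketch of how such anchor/positive/negative index tensors select distances (hypothetical data; an assumption about the math, not the library source):

```python
import numpy as np

emb = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0]])
anchor, pos, neg = np.array([0]), np.array([1]), np.array([2])

# euclidean distances for the selected triplet
d_pos = np.linalg.norm(emb[anchor] - emb[pos], axis=1)  # 1.0
d_neg = np.linalg.norm(emb[anchor] - emb[neg], axis=1)  # 3.0

margin = 0.5
loss = np.maximum(d_pos - d_neg + margin, 0.0).mean()
print(loss)  # hinge is inactive here: 1.0 - 3.0 + 0.5 < 0
```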

get_default_miner(is_session_dataset)[source]#

Get the default miner for this loss, given the dataset type

Return type

`Union`[`TripletMiner`, `TripletSessionMiner`]

Bases: `paddle.nn.Layer`, `finetuner.tuner.base.BaseLoss`[`paddle.Tensor`]

Compute the NTXent (Normalized Temperature-scaled Cross-Entropy) loss.

This loss function is a temperature-adjusted cross-entropy loss, as defined in the SimCLR paper <https://arxiv.org/abs/2002.05709>. It operates on batches where there are two views of each instance.
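Under the assumption that each label appears exactly twice in the batch, the NTXent computation can be sketched in NumPy as follows (cosine similarity; illustrative names, not the library's implementation):

```python
import numpy as np

def ntxent_loss(embeddings, labels, temperature=0.1):
    # For each item, its positive is the other item sharing its label;
    # every remaining item in the batch acts as a negative.
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = emb @ emb.T / temperature  # temperature-scaled cosine similarities
    n = len(labels)
    losses = []
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]][0]
        # cross-entropy over all items except the anchor itself
        logits = np.array([sim[i, k] for k in range(n) if k != i])
        losses.append(np.log(np.exp(logits).sum()) - sim[i, pos])
    return float(np.mean(losses))

# two well-separated pairs of views -> loss close to zero
emb = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
print(ntxent_loss(emb, [0, 0, 1, 1], temperature=0.1))
```

Lower temperatures sharpen the softmax, so well-separated positives drive the loss toward zero faster.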

Initialize the loss instance.

Parameters

temperature – The temperature parameter

forward(embeddings, labels)[source]#

Compute the loss.

Parameters
• embeddings (`Tensor`) – An `[N, d]` tensor of embeddings.

• labels (`Tensor`) – An `[N,]` tensor of item labels. It is expected that each label appears two times.

Return type

`Tensor`