finetuner.tailor.pytorch.projection_head module#

class finetuner.tailor.pytorch.projection_head.ProjectionHead(in_features, output_dim=128, num_layers=2)[source]#

Bases: torch.nn.Module

Projection head used internally for self-supervised training. By default it is a simple MLP (two layers, per the default `num_layers=2`) attached on top of the embedding model only for training purposes. After training, it should be removed, so that the embedding model alone produces the final embeddings.

EPSILON = 1e-05#
forward(x)[source]#
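
The page documents only the constructor signature, the `EPSILON` constant, and `forward`, not the layer layout. The sketch below is therefore an assumption: a common projection-head pattern of `Linear → BatchNorm1d → ReLU` blocks ending in a final `Linear` to `output_dim`, with `EPSILON` reused as the batch-norm epsilon. It is not necessarily the actual Finetuner implementation.

```python
import torch
from torch import nn


class ProjectionHead(nn.Module):
    """Hypothetical sketch of a projection head matching the documented
    signature. The internal layer layout is an assumption, not the
    confirmed Finetuner implementation."""

    EPSILON = 1e-05

    def __init__(self, in_features: int, output_dim: int = 128, num_layers: int = 2):
        super().__init__()
        layers = []
        # Hidden blocks: keep the feature width, normalize, apply ReLU.
        for _ in range(num_layers - 1):
            layers += [
                nn.Linear(in_features, in_features, bias=False),
                nn.BatchNorm1d(in_features, eps=self.EPSILON),
                nn.ReLU(inplace=True),
            ]
        # Final projection down (or up) to the embedding dimension.
        layers.append(nn.Linear(in_features, output_dim, bias=False))
        self.head = nn.Sequential(*layers)

    def forward(self, x):
        return self.head(x)


# Usage: attach on top of the embedding model during training, then discard.
head = ProjectionHead(in_features=512)
projected = head(torch.randn(8, 512))  # shape: (8, 128)
```

After training, only the base embedding model is kept; the head exists solely to give the self-supervised loss a space to operate in.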