# Welcome to Finetuner!

Finetuner lets you tune the weights of any deep neural network for better embeddings on search tasks. It accompanies Jina to deliver the last mile of performance for domain-specific neural search applications.

🎛 Designed for finetuning: a human-in-the-loop deep learning tool for leveling up your pretrained models in domain-specific neural search applications.

🔱 Powerful yet intuitive: all you need is `finetuner.fit()`, a one-liner that unlocks rich features such as Siamese/triplet networks, interactive labeling, layer pruning, weight freezing, and dimensionality reduction.

⚛️ Framework-agnostic: promises an identical API & user experience across the PyTorch, TensorFlow/Keras and PaddlePaddle deep learning backends.

🧈 Jina integration: buttery-smooth integration with Jina, reducing the cost of context-switching between experimentation and production.
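The Siamese/triplet objectives mentioned above work by pulling similar items together and pushing dissimilar ones apart in embedding space. As a rough illustration, here is a plain-NumPy sketch of a margin-based triplet loss (a hypothetical helper for intuition only, not Finetuner's internal implementation):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Margin-based triplet loss on embedding vectors.

    Hypothetical sketch: penalizes the anchor being closer to the
    negative than to the positive by more than `margin`.
    """
    d_pos = np.linalg.norm(anchor - positive)  # distance to a similar item
    d_neg = np.linalg.norm(anchor - negative)  # distance to a dissimilar item
    return max(d_pos - d_neg + margin, 0.0)

# Toy 3-dim embeddings: the positive sits close to the anchor,
# the negative far away.
a = np.array([1.0, 0.0, 0.0])
p = np.array([0.9, 0.1, 0.0])
n = np.array([0.0, 1.0, 0.0])

loss_good = triplet_loss(a, p, n)  # 0.0: ordering is already correct
loss_bad = triplet_loss(a, n, p)   # positive: triplet is in the wrong order
```

Training on such a loss nudges the embedding model so that `loss_bad`-style triplets become `loss_good`-style ones.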

## Quick start¶

1. Make sure you have Python 3.7+ installed on Linux/macOS, along with one of PyTorch (>=1.9), TensorFlow (>=2.5) or PaddlePaddle. Then install Finetuner:

```bash
pip install finetuner
```

2. In this example, we want to tune the 32-dim embedding vectors from a 2-layer MLP on the Fashion-MNIST dataset. Write the model with whichever of the following frameworks you prefer:

```python
import torch

embed_model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(in_features=28 * 28, out_features=128),
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=128, out_features=32),
)
```

```python
import tensorflow as tf

embed_model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(32),
])
```

```python
import paddle

embed_model = paddle.nn.Sequential(
    paddle.nn.Flatten(),
    paddle.nn.Linear(in_features=28 * 28, out_features=128),
    paddle.nn.ReLU(),
    paddle.nn.Linear(in_features=128, out_features=32),
)
```
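Whichever framework you choose, the architecture is the same: flatten the 28×28 image, project to 128 dimensions through a ReLU, then down to a 32-dim embedding. A framework-free NumPy sketch of just the shapes involved (hypothetical untrained weights, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomly initialised weights mirroring the 2-layer MLP above:
# Flatten -> Linear(784, 128) -> ReLU -> Linear(128, 32)
w1 = rng.standard_normal((28 * 28, 128)) * 0.01
w2 = rng.standard_normal((128, 32)) * 0.01

def embed(images):
    x = images.reshape(len(images), -1)  # flatten 28x28 images to 784-dim rows
    h = np.maximum(x @ w1, 0.0)          # first linear layer + ReLU
    return h @ w2                        # 32-dim embeddings

batch = rng.standard_normal((4, 28, 28))  # four fake Fashion-MNIST images
print(embed(batch).shape)                 # (4, 32)
```

Tuning these weights so that nearby embeddings mean similar images is exactly what Finetuner does in the next step.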

3. Now feed the model and Fashion-MNIST data into Finetuner.

```python
import finetuner
from finetuner.toydata import generate_fashion

finetuner.fit(
    embed_model,
    generate_fashion,
    interactive=True,
)
```

4. You can now label the data interactively. The model is tuned and improves as you label.

Now that you’re set up, let’s dive into more of how Finetuner works and improves the performance of your neural search apps.

## Next steps

Finetuner is extremely easy to learn: all you need is `finetuner.fit()`!

Answer the two questions below to quickly find which usage pattern fits your situation:

- Do you have an embedding model?
- Do you have labeled data?

### Finetuner usage 1: embedding model + labeled data

Perfect! Since you already have both an embed_model and train_data, simply do:

```python
import finetuner

tuned_model, summary = finetuner.fit(
    embed_model,
    train_data=train_data,
)
```

### Finetuner usage 2: embedding model, no labeled data

You have an embed_model to use, but no labeled data for fine-tuning it. No worries: you can use Finetuner to interactively label data and train embed_model as follows:

```python
import finetuner

tuned_model, summary = finetuner.fit(
    embed_model,
    train_data=unlabeled_data,
    interactive=True,
)
```

### Finetuner usage 3: general model + labeled data

You have a general_model that does not output embeddings. Luckily, you've got some labeled_data for training, and Finetuner can convert your model into an embedding model and train it via:

```python
import finetuner

tuned_model, summary = finetuner.fit(
    general_model,
    train_data=labeled_data,
    to_embedding_model=True,
    output_dim=100,
)
```
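Conceptually, `to_embedding_model=True` turns a task-specific model into an embedding model by swapping its task head for a projection to `output_dim` dimensions, while keeping the pretrained layers underneath. A hand-rolled NumPy sketch of that idea (illustrative only, not Finetuner's actual conversion logic):

```python
import numpy as np

rng = np.random.default_rng(1)

# A "general model": flatten -> hidden layer -> 10-way classification head.
w_hidden = rng.standard_normal((28 * 28, 128)) * 0.01  # pretrained body
w_cls = rng.standard_normal((128, 10)) * 0.01          # task head

def general_model(x):
    h = np.maximum(x.reshape(len(x), -1) @ w_hidden, 0.0)
    return h @ w_cls  # 10 class scores, not embeddings

# Conversion: drop the classification head and attach a fresh projection
# to the requested output_dim (here 100), reusing the pretrained body.
w_proj = rng.standard_normal((128, 100)) * 0.01  # newly initialised head

def embedding_model(x):
    h = np.maximum(x.reshape(len(x), -1) @ w_hidden, 0.0)
    return h @ w_proj  # 100-dim embeddings

batch = rng.standard_normal((2, 28, 28))
print(general_model(batch).shape)    # (2, 10)
print(embedding_model(batch).shape)  # (2, 100)
```

The new projection head then gets trained on your labeled data while the body's pretrained knowledge is retained.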

### Finetuner usage 4: general model, no labeled data

You have a general_model that does not output embeddings, and no labeled data for training either. No worries: Finetuner can help you train an embedding model with interactive labeling on the fly:

```python
import finetuner

tuned_model, summary = finetuner.fit(
    general_model,
    train_data=unlabeled_data,
    interactive=True,
    to_embedding_model=True,
    output_dim=100,
)
```