Welcome to Finetuner!#

Finetuner lets you tune the weights of any deep neural network to produce better embeddings for search tasks. It accompanies Jina to deliver the last mile of performance for domain-specific neural search applications.

🎛 Designed for finetuning: a human-in-the-loop deep learning tool for leveling up your pretrained models in domain-specific neural search applications.

🔱 Powerful yet intuitive: all you need is finetuner.fit() - a one-liner that unlocks rich features such as siamese/triplet network, metric learning, self-supervised pretraining, layer pruning, weights freezing, dimensionality reduction.

⚛️ Framework-agnostic: an identical API & user experience on the PyTorch, TensorFlow/Keras and PaddlePaddle deep learning backends.

🧈 DocArray integration: buttery smooth integration with DocArray, reducing the cost of context-switch between experiment and production.
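As a quick sanity check of the framework-agnostic setup, you can probe which deep learning backends are importable before calling Finetuner. This is a minimal sketch using only the standard library; Finetuner itself detects the installed backend automatically, so the check is purely illustrative:

```python
import importlib.util

# Probe which deep learning backends are importable in this environment
# without actually importing them.
backends = ['torch', 'tensorflow', 'paddle']
available = [b for b in backends if importlib.util.find_spec(b) is not None]
print(available)
```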

Quick start#

  1. Make sure that you have Python 3.7+ installed on Linux/macOS, and that you have one of PyTorch (>=1.9), TensorFlow (>=2.5) or PaddlePaddle installed.

    pip install finetuner
  2. In this example, we want to tune the embedding vectors from a ResNet18 on a customized CelebA dataset. Finetuner accepts a docarray DocumentArray, so we load the CelebA images into this format:

    from docarray import DocumentArray
    # please change the file path to your data path
    data = DocumentArray.from_files('img_align_celeba/*.jpg')
    def preproc(doc):
        return (
            doc.load_uri_to_image_tensor(224, 224)
            .set_image_tensor_channel_axis(-1, 0)
        )  # no need to change the channel axis if you are using tf/keras
    data.apply(preproc)  # preprocess every Document in place
  3. Let’s write a model with any of the following frameworks:

    import torchvision
    resnet = torchvision.models.resnet18(pretrained=True)
    import tensorflow as tf
    # Keras does not ship a ResNet18; ResNet50 is the closest built-in backbone
    resnet = tf.keras.applications.resnet50.ResNet50(weights='imagenet')
    import paddle
    resnet = paddle.vision.models.resnet18(pretrained=True)
  4. Now feed the model and the CelebA data into Finetuner.

    import finetuner as ft
    tuned_model = ft.fit(
        model=resnet,
        train_data=data,
        input_size=(3, 224, 224),
        layer_name='adaptiveavgpool2d_67',  # layer before fc as feature extractor
    )
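
The channel-axis step in the preprocessing above matters because PyTorch backbones expect channels-first input. Here is a toy NumPy sketch (not the Finetuner API) of what `set_image_tensor_channel_axis(-1, 0)` achieves:

```python
import numpy as np

# A freshly loaded image tensor is channels-last: (height, width, channels).
img = np.zeros((224, 224, 3))
# Move the channel axis from position -1 to position 0, giving the
# (channels, height, width) layout that PyTorch models expect.
chw = np.moveaxis(img, -1, 0)
print(chw.shape)  # (3, 224, 224)
```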

Now that you’re set up, let’s dive into more of how Finetuner works and improves the performance of your neural search apps.
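Why do better embeddings help? Neural search ultimately reduces to nearest-neighbour lookup in embedding space, so a tuned model places matching items closer together. A toy sketch with made-up vectors (not the Finetuner API) shows the lookup itself:

```python
import numpy as np

# Hypothetical embeddings: one query and three indexed items.
query = np.array([1.0, 0.0])
index = np.array([[0.9, 0.1],
                  [0.0, 1.0],
                  [-1.0, 0.0]])

# Cosine similarity between the query and every indexed embedding.
sims = index @ query / (np.linalg.norm(index, axis=1) * np.linalg.norm(query))
best = int(np.argmax(sims))
print(best)  # the closest item is index 0
```

The better the embedding model, the more reliably semantically matching items win this comparison; that is the gap Finetuner closes.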

Next steps#

Finetuner is extremely easy to learn: all you need is finetuner.fit()!


Join Us#

Finetuner is backed by Jina AI and licensed under Apache-2.0. We are actively hiring AI engineers and solution engineers to build the next neural search ecosystem in open source.
