Optimizers

Static Optimizer

A static (or offline) optimizer adjusts prompt templates and model parameters using algorithms before inference time. Optimization runs offline and the result is exportable: it aims to improve the function's overall performance without touching the prompt during inference. A static optimizer accepts the following parameters:
  • function (int): The id of the function this optimizer is for.
  • external_id (str): A unique identifier for the optimizer that can be used to call and retrieve it.
  • train_set (int): The id of the train dataset this optimizer is trained on.
  • validation_set (int): The id of the validation dataset this optimizer is validated on.
  • test_set (int): The id of the test dataset this optimizer is evaluated on.
  • magical_set (int): The id of the magical dataset; if any of the above datasets is missing, this one is used instead.
  • parameters (dict): The parameters for the optimizer with the following keys:
    • shots (int): The number of shots to use for the optimizer.
    • samples (int): The number of samples to use for the optimizer.
    • model_keywords (dict): The model keywords for the optimizer with the following keys:
      • temperature (float): The temperature for the optimizer
  • schedule (dict): The schedule for the optimizer with the following keys:
    • cron (str): The cron schedule for the optimizer
  • api_key (str): The API key for the optimizer
  • model (str): The name of the model to use for the optimizer
  • base_url (str): Optional base URL for custom LiteLLM server deployments
  • optimizer_type (str): The type of optimizer to use, which for the static optimizer is always fewshot.
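As a sketch, the parameter list above can be assembled with a small helper. `build_static_optimizer` below is a hypothetical convenience function, not part of the Zenbase API; the field names and defaults simply mirror the list above.

```python
def build_static_optimizer(function_id, external_id, train_set, validation_set,
                           test_set, api_key, model,
                           shots=5, samples=5, temperature=0.5,
                           cron="0 0 * * *", base_url=None):
    """Assemble the request body for a static (fewshot) optimizer."""
    body = {
        "function": function_id,
        "external_id": external_id,
        "train_set": train_set,
        "validation_set": validation_set,
        "test_set": test_set,
        "parameters": {
            "shots": shots,
            "samples": samples,
            "model_keywords": {"temperature": temperature},
        },
        "schedule": {"cron": cron},
        "api_key": api_key,
        "model": model,
        "optimizer_type": "fewshot",  # always "fewshot" for a static optimizer
    }
    if base_url is not None:
        # only needed for custom LiteLLM server deployments
        body["base_url"] = base_url
    return body
```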

Example

Example of how to create a static optimizer:
import requests
import json

BASE_URL = "https://orch.zenbase.ai/api"
API_KEY = "YOUR ZENBASE API KEY"

def api_call(method, endpoint, data=None):
    url = f"{BASE_URL}/{endpoint}"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Api-Key {API_KEY}"
    }
    response = requests.request(method, url, headers=headers, data=json.dumps(data) if data else None)
    return response


optimizer_data = {
    "function": function_id,
    "external_id": "my-static-optimizer-1",
    "train_set": train_dataset_id,
    "validation_set": validation_dataset_id,
    "test_set": test_dataset_id,
    "parameters": {
        "shots": 5,
        "samples": 5,
        "model_keywords": {
            "temperature": 0.5
        },
    },
    "schedule": {
        "cron": "*/5 * * * *"  # Run daily at midnight
    },
    "api_key": API_KEY,
    "model": "MODEL_NAME",  # For example gpt-4o-mini
    "optimizer_type": "fewshot",
}

optimizer = api_call("POST", "optimizer-configurations/", optimizer_data)
optimizer_id = optimizer.json()['id']
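The call above reads the id without checking whether the request succeeded. A small helper, shown here as a sketch and not part of the Zenbase SDK, can fail loudly on an API error first:

```python
def optimizer_id_from(response):
    """Return the created optimizer's id, raising on an HTTP error status."""
    if response.status_code >= 400:
        raise RuntimeError(
            f"optimizer creation failed: {response.status_code} {response.text}"
        )
    return response.json()["id"]

# optimizer_id = optimizer_id_from(
#     api_call("POST", "optimizer-configurations/", optimizer_data))
```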

Dynamic Optimizer

A dynamic (or online) optimizer adjusts prompt templates and model parameters for a specific task using both static and dynamic techniques. It uses input data to optimize the function's parameters and enhance performance during inference time. A dynamic optimizer accepts the following parameters:
  • function (int): The id of the function this optimizer is for.
  • external_id (str): A unique identifier for the optimizer that can be used to call and retrieve it.
  • training_set (int): The id of the training dataset this optimizer is trained on.
  • validation_set (int): The id of the validation dataset this optimizer is validated on.
  • test_set (int): The id of the test dataset this optimizer is evaluated on.
  • magical_set (int): The id of the magical dataset; if any of the above datasets is missing, this one is used instead.
  • parameters (dict): The parameters for the optimizer with the following keys:
    • shots (int): The number of shots to use for the optimizer.
    • model_keywords (dict): The model keywords for the optimizer with the following keys:
      • temperature (float): The temperature for the optimizer
  • schedule (dict): The schedule for the optimizer with the following keys:
    • cron (str): The cron schedule for the optimizer
  • api_key (str): The API key for the optimizer
  • model (str): The name of the model to use for the optimizer
  • base_url (str): Optional base URL for custom LiteLLM server deployments
  • optimizer_type (str): The type of optimizer to use, which for the dynamic optimizer is always dynamic_fewshot.
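The `cron` string uses the standard five-field format: minute, hour, day of month, month, day of week. As a rough illustration of how a field like `*/5` is read, here is a simplified sketch; it is not the scheduler Zenbase actually uses.

```python
def expand_cron_field(field, lo, hi):
    """Expand one cron field into the list of values it matches."""
    if field == "*":
        return list(range(lo, hi + 1))
    if field.startswith("*/"):
        # "*/n" means every n-th value within the field's range
        step = int(field[2:])
        return list(range(lo, hi + 1, step))
    # otherwise a comma-separated list of explicit values, e.g. "0,30"
    return [int(v) for v in field.split(",")]

# "*/5 * * * *" fires every 5 minutes; "0 0 * * *" fires daily at midnight
minutes = expand_cron_field("*/5", 0, 59)
```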

Example

Example of how to create a dynamic optimizer:
import requests
import json

BASE_URL = "https://orch.zenbase.ai/api"
API_KEY = "YOUR ZENBASE API KEY"

def api_call(method, endpoint, data=None):
    url = f"{BASE_URL}/{endpoint}"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Api-Key {API_KEY}"
    }
    response = requests.request(method, url, headers=headers, data=json.dumps(data) if data else None)
    return response


optimizer_data = {
    "function": function_id,
    "external_id": "my-dynamic-optimizer-1",
    "training_set": training_set_id,
    "validation_set": validation_set_id,
    "test_set": test_set_id,
    "parameters": {
        "shots": 5,
        "model_keywords": {
            "temperature": 0.5
        },
    },
    "schedule": {
        "cron": "*/5 * * * *"  # Run daily at midnight
    },
    "api_key": API_KEY,
    "model": "gpt-4o-mini",
    "optimizer_type": "dynamic_fewshot",
}
optimizer = api_call("POST", "optimizer-configurations/", optimizer_data)
optimizer_id = optimizer.json()['id']
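The `external_id` set above can later be used to retrieve the optimizer. The retrieval path below is an assumption extrapolated from the creation endpoint; check the Zenbase API reference for the actual route.

```python
def optimizer_endpoint(external_id):
    # Hypothetical retrieval route, assumed from the creation endpoint above.
    return f"optimizer-configurations/{external_id}/"

# fetched = api_call("GET", optimizer_endpoint("my-dynamic-optimizer-1"))
```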

Finetune Optimizer

A finetune optimizer can be either static or dynamic. It fine-tunes a model to optimize the function's performance. A finetune optimizer accepts the following parameters:
  • function (int): The id of the function this optimizer is for
  • external_id (str): A unique identifier for the optimizer that can be used to call and retrieve it.
  • train_set (int): The id of the train dataset this optimizer is trained on
  • validation_set (int): The id of the validation dataset this optimizer is validated on
  • test_set (int): The id of the test dataset this optimizer is evaluated on
  • magical_set (int): The id of the magical dataset; if any of the above datasets is missing, this one is used instead
  • parameters (dict): The parameters for the optimizer with the following keys:
    • shots (int): The number of shots to use for the optimizer
    • samples (int): The number of samples to use for the optimizer (only for static optimizer)
    • hyperparameter (dict): The hyperparameters for fine-tuning the model with the following keys:
      • n_epochs (int): The number of epochs for fine-tuning
    • model_keywords (dict): The model keywords for the optimizer with the following keys:
      • temperature (float): The temperature for the optimizer
  • schedule (dict): The schedule for the optimizer with the following keys:
    • cron (str): The cron schedule for the optimizer
  • api_key (str): The API key for the optimizer
  • model (str): The name of the model to use for the optimizer
  • base_url (str): Optional base URL for custom LiteLLM server deployments
  • base_fine_tuning_model (str): The base model for fine-tuning the optimizer; currently only OpenAI models are supported
  • optimizer_type (str): The type of optimizer to use; currently fewshot and dynamic_fewshot are supported
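Since `samples` applies only to the static (`fewshot`) variant, a small helper can build the `parameters` dict for either type. This is a hypothetical sketch, not part of the Zenbase API; the keys mirror the list above.

```python
def finetune_parameters(optimizer_type, shots=5, samples=5,
                        n_epochs=1, temperature=0.5):
    """Build the `parameters` dict for a finetune optimizer.

    `samples` is only included for the static ("fewshot") variant,
    per the parameter list above.
    """
    params = {
        "shots": shots,
        "hyperparameter": {"n_epochs": n_epochs},
        "model_keywords": {"temperature": temperature},
    }
    if optimizer_type == "fewshot":
        params["samples"] = samples
    return params
```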

Example

Example of how to create a finetune optimizer:
import requests
import json

BASE_URL = "https://orch.zenbase.ai/api"
API_KEY = "YOUR ZENBASE API KEY"

def api_call(method, endpoint, data=None):
    url = f"{BASE_URL}/{endpoint}"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Api-Key {API_KEY}"
    }
    response = requests.request(method, url, headers=headers, data=json.dumps(data) if data else None)
    return response


optimizer_data = {
    "function": function_id,
    "external_id": "my-finetune-optimizer-1",
    "train_set": train_dataset_id,
    "validation_set": validation_dataset_id,
    "test_set": test_dataset_id,
    "parameters": {
        "shots": 5,
        "samples": 5,
        "hyperparameter": {
            "n_epochs": 1,
        },
    },
    "schedule": {
        "cron": "*/5 * * * *"  # Run daily at midnight
    },
    "api_key": API_KEY,
    "model": "gpt-4o-mini",
    "base_fine_tuning_model": "gpt-4o-mini-2024-07-18",
    "optimizer_type": "dynamic_fewshot",
}
optimizer = api_call("POST", "optimizer-configurations/", optimizer_data)
optimizer_id = optimizer.json()['id']