Reusing previously fine-tuned TimeGPT models can help reduce computation time and costs while maintaining or improving forecast accuracy.
This guide walks you through the steps to save, fine-tune, list, and delete your TimeGPT models effectively.

1. Import packages

First, import all necessary Python packages and initialize the Nixtla client.

Import Packages and Initialize Client
import pandas as pd
from nixtla import NixtlaClient
from utilsforecast.losses import rmse
from utilsforecast.evaluation import evaluate
Initialize Nixtla Client
nixtla_client = NixtlaClient(
    # defaults to os.environ["NIXTLA_API_KEY"]
    api_key='my_api_key_provided_by_nixtla'
)
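
Optionally, verify that the client is configured correctly before proceeding. The SDK's validate_api_key method returns True when the key is accepted:

Validate API Key (optional)
# Quick sanity check: returns True if the API key is valid
nixtla_client.validate_api_key()
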
2. Load data

Load the forecasting dataset and prepare the train/validation split.

Load and Split Dataset
df = pd.read_parquet('https://datasets-nixtla.s3.amazonaws.com/m4-hourly.parquet')

h = 48

valid = df.groupby('unique_id', observed=True).tail(h)
train = df.drop(valid.index)

train.head()

Below is an example of how the data looks:

  unique_id  ds      y
0        H1   1  605.0
1        H1   2  586.0
2        H1   3  586.0
3        H1   4  559.0
4        H1   5  511.0
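
As a quick sanity check on the split, you can confirm that each series contributes exactly its last h observations to the validation set (plain pandas, no additional assumptions):

Check Train/Validation Split
# Each series should have exactly h rows held out for validation
assert (valid.groupby('unique_id', observed=True).size() == h).all()
print(train.shape, valid.shape)
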
3. Zero-shot forecast

Generate a quick forecast without fine-tuning to serve as your baseline.

Zero-shot Forecast and Evaluation
fcst_kwargs = {
    'df': train,
    'freq': 1,
    'model': 'timegpt-1-long-horizon'
}

fcst = nixtla_client.forecast(h=h, **fcst_kwargs)

zero_shot_eval = evaluate(fcst.merge(valid), metrics=[rmse], agg_fn='mean')
zero_shot_eval

Note: the specified horizon (h=48) exceeds the model's default optimal horizon. Shorter horizons often yield more accurate forecasts.

Baseline evaluation result:

  metric      TimeGPT
0   rmse  1504.474342
4. Fine-tune the model
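
Fine-tune TimeGPT on the training data and save the resulting model for later reuse. The block below is a minimal sketch of this step: finetune and the names first_model_id and first_finetune_eval are used later in this guide, while the output_model_id argument (naming the saved model to match the my-first-finetuned-model entry in step 6) is an assumption.

Fine-tune, Forecast, and Evaluate
# Fine-tune TimeGPT on the training data and save the result.
# NOTE: output_model_id is an assumed argument name; finetune is
# expected to return the identifier of the saved model.
first_model_id = nixtla_client.finetune(
    output_model_id='my-first-finetuned-model',
    **fcst_kwargs
)

# Forecast with the fine-tuned model and evaluate on the validation set
first_finetune_fcst = nixtla_client.forecast(
    h=h,
    finetuned_model_id=first_model_id,
    **fcst_kwargs
)

first_finetune_eval = evaluate(
    first_finetune_fcst.merge(valid),
    metrics=[rmse],
    agg_fn='mean'
)

# Compare the baseline against the fine-tuned model
zero_shot_eval.merge(
    first_finetune_eval,
    on=['metric'],
    suffixes=('_zero_shot', '_first_finetune')
)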

5. Further fine-tune the model

You can fine-tune an already fine-tuned model to improve performance even more. Here, finetune_depth=3 tunes a larger share of the model's parameters than the default depth of 1 (compare the depth column in the listing in step 6), at the cost of longer training:

Further Fine-tuning
second_model_id = nixtla_client.finetune(
    finetuned_model_id=first_model_id,
    finetune_depth=3,
    **fcst_kwargs
)

second_model_id

Use the newly obtained model identifier to forecast:

Forecast using Further Fine-tuned Model
second_finetune_fcst = nixtla_client.forecast(
    h=h,
    finetuned_model_id=second_model_id,
    **fcst_kwargs
)

second_finetune_eval = evaluate(
    second_finetune_fcst.merge(valid),
    metrics=[rmse],
    agg_fn='mean'
)

first_finetune_eval.merge(
    second_finetune_eval,
    on=['metric'],
    suffixes=('_first_finetune', '_second_finetune')
)

Updated evaluation results:

  metric  TimeGPT_first_finetune  TimeGPT_second_finetune
0   rmse             1472.024619              1435.365211

Fine-tuning on top of an already fine-tuned model can further improve forecast quality: here, the RMSE drops from about 1472 to 1435.

6. List fine-tuned models

You can view a list of all your fine-tuned models:

List Fine-tuned Models
nixtla_client.finetuned_models(as_df=True)

                                     id                        created_at  created_by             base_model_id  steps  depth     loss                   model  freq
0  468b13fb-4b26-447a-bd87-87a64b50d913  2024-12-30 17:57:31.241455+00:00        user  my-first-finetuned-model     10      3  default  timegpt-1-long-horizon    MS
1              my-first-finetuned-model  2024-12-30 17:57:16.978907+00:00        user                      None     10      1  default  timegpt-1-long-horizon    MS
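
If you prefer to work with the models programmatically, you can drop as_df=True. A minimal sketch, assuming the method then returns model objects whose attributes mirror the DataFrame columns above:

List Fine-tuned Model IDs
# Without as_df=True, finetuned_models() is assumed to return model
# objects exposing attributes such as id
for m in nixtla_client.finetuned_models():
    print(m.id)
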
7. Delete fine-tuned models

When you no longer need a model, you can delete it to keep your workspace tidy:

Delete Fine-tuned Model and List Updated Models
nixtla_client.delete_finetuned_model(first_model_id)

nixtla_client.finetuned_models(as_df=True)

Deleting a fine-tuned model is irreversible. Make sure to back up any necessary information before removal.
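
To clear out every stored model at once, the two calls above can be combined. A minimal sketch, again assuming the listed model objects expose an id attribute; remember that each deletion is irreversible:

Delete All Fine-tuned Models (use with care)
# Deletes EVERY fine-tuned model in the workspace; this cannot be undone
for m in nixtla_client.finetuned_models():
    nixtla_client.delete_finetuned_model(m.id)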

Congratulations! You have successfully learned how to save, refine, and manage your fine-tuned TimeGPT models.
This workflow helps optimize your forecasting pipelines by leveraging previously generated insights.