Distribute TimeGPT forecasting jobs on Ray for scalable Python workloads.
TimeGPT on Ray
Ray is an open-source unified compute framework that helps scale Python workloads for distributed computing. In this tutorial, you will learn how to distribute TimeGPT forecasting jobs on top of Ray.
This guide uses Fugue to easily run code across various distributed computing frameworks, including Ray.
Overview
Below is an outline of what we’ll cover:
- Installation
- Load Your Data
- Initialize Ray
- Use TimeGPT on Ray
- Shutdown Ray
1. Installation
Install Fugue with its Ray extra. Fugue provides an easy-to-use interface for distributed computation, letting you run the same Python code on several distributed computing frameworks, including Ray.
When executing on a distributed Ray cluster, ensure the nixtla library is installed on every worker node.
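A minimal install command, assuming the current PyPI package names for Fugue's Ray extra and the Nixtla SDK:

```shell
# Install Fugue with Ray support, plus the Nixtla SDK (assumed package names)
pip install "fugue[ray]" nixtla
```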
2. Load Your Data
Load your dataset into a pandas DataFrame. This tutorial uses hourly electricity prices from various markets:
Preview of the first few rows of data
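Since the original preview is not reproduced here, the sketch below builds a small hypothetical DataFrame with the same shape: TimeGPT expects long format, one row per (series id, timestamp) pair, with columns for the series identifier, the timestamp, and the target value (column names `unique_id`, `ds`, and `y` are assumed).

```python
import pandas as pd

# Hypothetical stand-in for the hourly electricity price data:
# one market ("BE"), hourly timestamps, and a price column.
df = pd.DataFrame({
    "unique_id": ["BE"] * 5,                                  # market identifier
    "ds": pd.date_range("2016-10-22", periods=5, freq="h"),   # hourly timestamps
    "y": [70.0, 37.1, 37.1, 44.8, 37.1],                      # electricity price
})
print(df.head())
```

In practice you would replace this with `pd.read_csv(...)` pointed at your own dataset, keeping the same three-column layout.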
3. Initialize Ray
Here, we spin up a Ray cluster locally by creating a head node. In a real cluster environment, you can scale this out to multiple machines.
4. Use TimeGPT on Ray
With Ray, you can run TimeGPT much as you would in a standard (non-distributed) local environment. Operations such as forecast apply directly to Ray Dataset objects.
5. Shutdown Ray
Always shut down Ray after you finish your tasks to free up resources.
Congratulations! You’ve successfully used TimeGPT on Ray for distributed forecasting.