[Figure: Schematic of Mamba4Cast's architecture, showing its three main components: (1) input embeddings, (2) encoder blocks, (3) decoder.]

Mamba4Cast: Efficient Zero-Shot Time Series Forecasting with State Space Models

This repository contains the code for Mamba4Cast, a zero-shot foundation model for time series forecasting built on the Mamba architecture. Mamba4Cast predicts the entire forecast horizon in a single pass, offering efficient and scalable forecasting on real-world datasets without task-specific fine-tuning.

Project Overview

Mamba4Cast introduces a novel approach to time series forecasting by leveraging synthetic data for training. Built on the Mamba architecture, this model generalizes effectively across diverse time series tasks. Unlike autoregressive models, which predict one step at a time, Mamba4Cast makes a single prediction for the entire forecast horizon, significantly improving computational efficiency.
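
To make the single-pass contrast concrete, the sketch below compares full-horizon prediction with an autoregressive loop. It does not use the real Mamba4Cast API: toy_model is a hypothetical stand-in forecaster that simply repeats the last observed value.

    # Conceptual sketch, not the actual Mamba4Cast API: contrasts single-pass
    # full-horizon forecasting with an autoregressive loop. `toy_model` is a
    # hypothetical stand-in that just repeats the last observed value.
    import numpy as np

    def toy_model(context: np.ndarray, horizon: int) -> np.ndarray:
        """Stand-in forecaster: repeat the last context value `horizon` times."""
        return np.full(horizon, context[-1])

    context = np.sin(np.linspace(0, 8 * np.pi, 256))  # observed history
    horizon = 48

    # Single-pass (Mamba4Cast-style): one forward call yields the whole horizon.
    single_pass_forecast = toy_model(context, horizon)

    # Autoregressive baseline: one step at a time, feeding predictions back in.
    ar_context = context.copy()
    ar_forecast = []
    for _ in range(horizon):
        next_step = toy_model(ar_context, 1)[0]
        ar_forecast.append(next_step)
        ar_context = np.append(ar_context, next_step)

    print(single_pass_forecast.shape, len(ar_forecast))  # (48,) 48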

Key Features

  * Zero-shot forecasting: no task-specific fine-tuning is required.
  * Single-pass inference: the entire forecast horizon is predicted in one forward pass, unlike autoregressive models.
  * Built on the Mamba state space architecture for efficient, scalable inference.
  * Trained entirely on synthetic data, yet generalizes across diverse real-world time series.

Installation

To install the necessary dependencies and set up the environment for Mamba4Cast, follow the steps below:

  1. Clone the repository:

    git clone https://github.com/automl/Mamba4Cast
    cd Mamba4Cast
    
  2. Create a virtual environment (optional but recommended) with Python 3.10.13.

  3. Activate the environment and run the installation script:

    bash install_requirements.sh

  4. Create a directory models in the main directory of the project, download our model weights, and place the downloaded weights file in models (a quick sanity check of the setup is sketched after this list).
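
As a quick sanity check, the snippet below verifies that PyTorch and CUDA are visible and that the downloaded checkpoint can be read. It assumes install_requirements.sh installs PyTorch; the filename models/mamba4cast.ckpt is a placeholder, so substitute the actual name of the weights file you downloaded.

    # Hedged sanity check: verifies PyTorch/CUDA and that the downloaded
    # checkpoint is readable. "models/mamba4cast.ckpt" is a hypothetical
    # filename; replace it with the actual weights file you placed in models/.
    from pathlib import Path

    import torch

    print("PyTorch version:", torch.__version__)
    print("CUDA available:", torch.cuda.is_available())

    weights_path = Path("models") / "mamba4cast.ckpt"
    if weights_path.exists():
        # Load on CPU just to confirm the file is readable.
        checkpoint = torch.load(weights_path, map_location="cpu")
        print("Checkpoint loaded, type:", type(checkpoint).__name__)
    else:
        print("Weights not found at", weights_path, "- check the filename.")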

Other requirements:

Example Usage

Refer to the Mamba4Cast_example.ipynb notebook to reproduce results resembling the sine-wave figure in the paper.
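
For orientation, here is a simplified, self-contained sketch of that sine-wave demo using only numpy and matplotlib. The forecast is a seasonal-naive placeholder (repeating the last observed period) standing in for the model call made in the notebook, so this script runs without the model weights.

    # Simplified sine-wave demo. The forecast is a seasonal-naive placeholder
    # standing in for Mamba4Cast inference (see Mamba4Cast_example.ipynb for
    # the real model call).
    import numpy as np
    import matplotlib.pyplot as plt

    period = 64
    t = np.arange(512)
    series = np.sin(2 * np.pi * t / period)

    horizon = 64
    context = series[:-horizon]
    forecast = context[-period:][:horizon]  # placeholder: repeat the last observed period

    plt.plot(np.arange(len(context)), context, label="context")
    plt.plot(np.arange(len(context), len(context) + horizon), forecast, label="forecast")
    plt.legend()
    plt.title("Sine-wave demo (placeholder forecast)")
    plt.show()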