From News to Forecast: Integrating Event Analysis in LLM-based Time Series Forecasting with Reflection (NeurIPS 2024)
[Paper (arXiv)], [MIT Technology Review (China) Feature] <br> Xinlei Wang, Maike Feng, Jing Qiu, Jinjin Gu, Junhua Zhao <br> School of Electrical and Computer Engineering, The University of Sydney; <br> School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen; <br> Shenzhen Institute of Artificial Intelligence and Robotics for Society
This repository contains the code and dataset for our paper: "From News to Forecast: Integrating Event Analysis in LLM-Based Time Series Forecasting with Reflection", presented at NeurIPS 2024.
Abstract
This paper introduces a novel approach to enhance time series forecasting using Large Language Models (LLMs) and Generative Agents. Our method integrates real-world social events, extracted from news, with traditional time series data. This allows our model to respond to unexpected incidents and societal shifts, improving the accuracy of predictions.
The main components of our system are:
- LLM-based agents: Automatically filter and analyze relevant news for time series forecasting.
- Data preparation and fine-tuning: After pairing the selected news with the time series data, the model is fine-tuned to enhance forecasting accuracy further.
- Reasoning logic updates: Refine the selection of news and improve prediction accuracy iteratively.
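To make the workflow concrete, here is a minimal, illustrative Python sketch of this loop. Every helper in it is a hypothetical placeholder, not the repository's API: the actual agents prompt an LLM to filter news and fine-tune with LoRA, as described under Program Details below.

```python
# Illustrative sketch only: each helper is a hypothetical stand-in for the
# LLM-driven component described above, not the repository's API.
def select_news(news_pool, logic):
    # Agent step: keep news items the current reasoning logic deems relevant.
    return [item for item in news_pool if logic(item)]

def finetune(paired_data):
    # Placeholder for LoRA fine-tuning on paired news/series data
    # (see llm-finetune.py under Program Details).
    return {"trained_pairs": len(paired_data)}

def reflect(logic, validation_error):
    # Reflection step: a real agent would rewrite its selection criteria here.
    return logic

def news_to_forecast(news_pool, series, logic, rounds=3):
    model = None
    for _ in range(rounds):
        selected = select_news(news_pool, logic)
        # Pair each selected news item with its matching time series window.
        paired = [(item, series.get(item["date"])) for item in selected]
        model = finetune(paired)
        validation_error = 0.0  # placeholder: evaluate on held-out windows
        logic = reflect(logic, validation_error)
    return model, logic
```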
Features
- Integration of unstructured news data into numerical time series forecasting.
- Iterative event reasoning through LLMs to continuously refine predictions.
- Application across multiple domains, including electricity demand, exchange rates, Bitcoin price, and traffic forecasting.
Dataset
Overview
Our dataset covers multiple areas where time series forecasting can be enhanced by integrating real-world events and news data. It combines structured numerical data with unstructured textual information, enabling more accurate and adaptive forecasts across several domains, including electricity demand, Bitcoin price, exchange rates, and traffic.
We provide:
- State-level half-hourly electricity load data from the Australian Energy Market Operator (AEMO), with supplementary information, covering 2019 to 2022 (`data/raw_time_series_data/weather_load_2019-2022.csv`);
- Daily exchange rate data focused on the Australian dollar, with supplementary economic indicators, covering 2018 to 2022 (`data/raw_time_series_data/Exchang_all_data_2018-2022_D_final.csv`);
- Daily Bitcoin price data from 2019 to 2021 (`data/raw_time_series_data/bitcoin_daily.csv`);
- Hourly traffic volume on California roads from 2015 to 2016 (`data/raw_time_series_data/traffic_hourly.csv`).
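Each file is a plain CSV and can be inspected directly; a quick look with pandas (column names differ per file and are not assumed here):

```python
import pandas as pd

# Load one of the raw time series files; paths are relative to the repo root.
df = pd.read_csv("data/raw_time_series_data/bitcoin_daily.csv")
print(df.shape)    # rows x columns
print(df.columns)  # inspect the schema before pairing with news
print(df.head())
```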
News Data
The news data is collected from a variety of sources, including:
- GDELT Project: A global database that monitors news media worldwide in real-time.
- Yahoo Finance: For financial news related to the exchange rate and Bitcoin price domains.
- News AU: Australian national and international news.
We also enhance the dataset with supplementary information such as weather data (from OpenWeatherMap), calendar dates, and economic indicators to further enrich the context for forecasting.
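For illustration only (this is not the authors' collection script), supplementary weather context can be pulled from the OpenWeatherMap current-weather endpoint; the city and `API_KEY` below are placeholders:

```python
import requests

# Hypothetical example of fetching supplementary weather data from
# OpenWeatherMap; replace API_KEY with your own key.
API_KEY = "YOUR_API_KEY"
url = "https://api.openweathermap.org/data/2.5/weather"
resp = requests.get(url, params={"q": "Sydney,AU", "appid": API_KEY, "units": "metric"})
weather = resp.json()
print(weather["main"]["temp"], weather["weather"][0]["description"])
```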
The Australian news data used in the forecasting model for the energy and exchange domains is stored in the `data/raw_news_data/AU-news` directory, and the raw news data related to the Bitcoin and traffic domains is stored in the `data/raw_news_data` directory. The 2019 news file is split into parts; merge them with:

```bash
cat data/raw_news_data/AU-news/news_processed_data_2019_part_* > data/raw_news_data/AU-news/news_processed_data_2019_merged.json
```
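After merging, it is worth confirming that the concatenated parts form valid JSON; a small sanity check (assuming the parts split a single JSON document, which the `cat` merge implies):

```python
import json

# Sanity check: confirm the merged news file parses as valid JSON.
path = "data/raw_news_data/AU-news/news_processed_data_2019_merged.json"
with open(path) as f:
    news = json.load(f)
print(type(news), len(news))  # e.g. a list of news records
```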
The `data/paired_time_series_news_training_data` directory contains the matched time series and news data used to fine-tune and test the model.
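To get a feel for the pairing format without assuming its schema, you can list the JSON files in that directory and inspect the first record of each:

```python
import json
from pathlib import Path

# Peek at the paired data without assuming its schema; assumes each file
# holds a JSON list of records (adjust if a file is structured differently).
data_dir = Path("data/paired_time_series_news_training_data")
for fp in sorted(data_dir.glob("*.json")):
    with open(fp) as f:
        records = json.load(f)
    if isinstance(records, list) and records:
        print(f"{fp.name}: {len(records)} records; keys: {list(records[0])}")
```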
Program Details
Fine-tune LLM for Time Series Forecasting
- Create and activate a conda environment:

```bash
conda create -n llm_news_ts python=3.9 -y
conda activate llm_news_ts
```
- Clone the repository and navigate to the project directory:

```bash
git clone https://github.com/ameliawong1996/From_News_to_Forecast.git
cd From_News_to_Forecast
```
- Install the necessary dependencies (make sure `requirements.txt` lists the needed packages):

```bash
pip install -r requirements.txt
```
- Verify the bitsandbytes installation:

```bash
python -m bitsandbytes
```
- Download the pre-trained language model:

```bash
python model_download.py --repo_id daryl149/llama-2-7b-chat-hf
```
- Fine-tune the LLM to predict time series. Replace each `{}` placeholder with your own paths. Fine-tuning the 7B language model requires at least 16 GB of GPU memory (P100, T4, or better) on one or more GPUs; our work trains on a single A100 (40 GB), which takes about one day.

```bash
deepspeed --include localhost:0 --master_port 29000 llm-finetune.py \
    --model_name_or_path {path_to_your_downloaded_model:daryl149/llama-2-7b-chat-hf} \
    --tokenizer_name {path_to_your_downloaded_model:daryl149/llama-2-7b-chat-hf} \
    --train_files {path_to_your_time_series_data;example:ts_data/AU_load_with_News_train.json} \
    --validation_files {path_to_your_time_series_data;example:ts_data/AU_load_with_News_test.json} \
    --per_device_train_batch_size 4 \
    --per_device_eval_batch_size 4 \
    --do_train \
    --do_eval \
    --use_fast_tokenizer true \
    --output_dir {path_where_you_save_your_results} \
    --evaluation_strategy steps \
    --max_eval_samples 400 \
    --learning_rate 1e-4 \
    --gradient_accumulation_steps 4 \
    --num_train_epochs 8 \
    --warmup_steps 400 \
    --load_in_bits 8 \
    --lora_r 8 \
    --lora_alpha 16 \
    --target_modules q_proj,k_proj,v_proj,o_proj,down_proj,gate_proj,up_proj \
    --logging_dir {path_where_you_save_your_results}/logs \
    --logging_strategy steps \
    --logging_steps 10 \
    --save_strategy steps \
    --preprocessing_num_workers 10 \
    --save_steps 200 \
    --eval_steps 200 \
    --save_total_limit 2000 \
    --seed 42 \
    --disable_tqdm false \
    --ddp_find_unused_parameters false \
    --block_size 2048 \
    --report_to tensorboard \
    --overwrite_output_dir \
    --ignore_data_skip true \
    --gradient_checkpointing \
    --ddp_timeout 18000000
```
`finetune-lls4ts.sh` is a shell script that contains the training command above.
- Test the trained LLM for time series prediction. This script generates a new JSON file whose prompts match the input file and whose outputs contain the test results:

```bash
python validation.py \
    --base_model {path_to_your_downloaded_model:daryl149/llama-2-7b-chat-hf} \
    --lora_weights {path_to_your_checkpoint_dir;example:your_experiment/checkpoint-2000} \
    --val_data {path_to_your_time_series_data;example:ts_data/AU_load_with_News_test.json} \
    --prompter_name ts_test \
    --save {path_to_your_time_series_test_results}
```
- Use `Evaluation.ipynb` to compute metrics on the generated time series JSON file. The notebook compares the prediction performance of different forecasting models using:
  - MSE (Mean Squared Error)
  - RMSE (Root Mean Square Error)
  - MAE (Mean Absolute Error)
  - MAPE (Mean Absolute Percentage Error)
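For reference, these four metrics are standard and reduce to a few lines of NumPy; the notebook remains the source of truth, but a minimal equivalent looks like this:

```python
import numpy as np

def metrics(y_true, y_pred):
    """Standard point-forecast error metrics."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_pred - y_true
    mse = np.mean(err ** 2)
    return {
        "MSE": mse,
        "RMSE": np.sqrt(mse),
        "MAE": np.mean(np.abs(err)),
        "MAPE": np.mean(np.abs(err / y_true)) * 100,  # assumes y_true has no zeros
    }

print(metrics([100, 110, 120], [98, 112, 119]))
```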
Agent Building
- In the `Agent/code/train_and_test.py` file, follow the training script configuration above and replace `[path_to_your_LLM:daryl149/llama-2-7b-chat-hf]` and `[path_to_your_checkpoint]` with your own paths. Then you can use Python to start a new training or testing run.
- Open and run `Agent/code/AgentBuilding.ipynb` for the agent and reflection designs. The code reads the relevant data from the `Agent/Data_all` directory.
Citation
If you find our research helpful, please cite our paper:
```bibtex
@inproceedings{wang2024newsforecast,
  title={From News to Forecast: Integrating Event Analysis in LLM-Based Time Series Forecasting with Reflection},
  author={Wang, Xinlei and Feng, Maike and Qiu, Jing and Gu, Jinjin and Zhao, Junhua},
  booktitle={Advances in Neural Information Processing Systems},
  year={2024}
}
```