# <img src="https://github.com/ntt-dkiku/route-explainer/assets/154794155/3ae1a5ea-0262-4120-ad05-0768ca9292f2" align="left" width="70px"> RouteExplainer: An Explanation Framework for Vehicle Routing Problem (PAKDD 2024)

<p align="center">
  <a href="https://ntt-dkiku.github.io/xai-vrp/" target="_blank"><img src="https://img.shields.io/badge/project-page-blue"></a>
  <a href="https://arxiv.org/abs/2403.03585" target="_blank"><img src="https://img.shields.io/badge/arXiv-abs-red"></a>
  <a href="https://huggingface.co/spaces/oookiku/route-explainer" target="_blank"><img src="https://img.shields.io/badge/🤗-demo-yellow"></a>
  <a href="https://pakdd2024.org/" target="_blank"><img src="https://img.shields.io/badge/PAKDD-2024-green"></a>
</p>

This repo is the official implementation of <a href="https://arxiv.org/abs/2403.03585" target="_blank">RouteExplainer: An Explanation Framework for Vehicle Routing Problem</a> (PAKDD 2024). RouteExplainer is the first explanation framework for the Vehicle Routing Problem. It generates an explanation of the influence of a specific edge in a route using a counterfactual explanation framework, and the explanation text is produced by an LLM (GPT-4). On <a href="https://huggingface.co/spaces/oookiku/route-explainer" target="_blank">Hugging Face Spaces</a>, we publish a demo of interactive tourist route generation, a promising application of RouteExplainer. Please try it out for yourself.

<img src="https://github.com/ntt-dkiku/route-explainer/assets/154794155/1f955ed1-e0c0-4875-bebb-b6b1c4f04df8">

## 📦 Setup

We recommend using Docker to set up the development environment. Please use the Dockerfile in this repository. In the following, all commands are assumed to be run inside the Docker container.

```bash
docker build -t route_explainer/route_explainer:1.0 .
```

After launching the container with the following command, you can run code interactively inside it (`<>` denotes a placeholder that you should replace according to your settings). Please set `--shm-size` as large as possible, because repeatedly invoking the conventional solvers (e.g., Concorde and LKH) consumes a lot of shared memory. This repeated invocation is required when generating datasets and evaluating the edge classifiers.

```bash
docker run -it --rm -v </path/to/clone/repo>:/workspace/app --name evrp-eps -p <host_port>:<container_port> --shm-size <large_size (e.g., 30g)> --gpus all route_explainer/route_explainer:1.0 bash
```
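For example, assuming the repo is cloned into the current directory, the demo port is 8888, and 30 GB of shared memory is available (all illustrative values, not requirements):

```bash
# Illustrative values only: mount the current clone, map port 8888, allocate 30 GB of shared memory
docker run -it --rm -v $(pwd):/workspace/app --name evrp-eps -p 8888:8888 --shm-size 30g --gpus all route_explainer/route_explainer:1.0 bash
```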

If you use LKH and Concorde, install them with the following command. They are required for reproducing the experiments, but not for the demo.

```bash
python install_solvers.py
```

## 🔧 Usage

Here, we describe the general usage of our code. See [Reproducibility](#rep) for how to reproduce the experiments in our paper.

### 1. Generating synthetic data with labels

You can generate a synthetic dataset with the following command. Here, the solver generates routes for the given VRP instances, and the classifier annotates the edges in the generated routes. The main options are as follows (see the `-h` option for the full list):

| parameter | options | remarks |
| --- | --- | --- |
| `problem` | tsptw, pctsp, pctsptw, cvrp | |
| `solver` | tsptw: ortools, lkh. pctsp: ortools. pctsptw: ortools. cvrp: ortools, lkh. | available solvers depend on the problem |
| `classifier` | tsptw, pctsp: ortools, lkh, concorde. pctsptw: ortools. cvrp: ortools, lkh. | available classifiers depend on the problem |
```bash
python generate_dataset.py --problem <problem> --num_nodes <num_nodes> --num_samples 128000 10000 10000 --solver <solver> --classifier <classifier> --output_dir data --random_seed 1234 --annotation --parallel
```

The datasets are saved as `<output_dir>/<problem>/<data_type>_<problem>_<num_nodes>nodes_<num_sample>samples_seed<random_seed>.pkl`, where `<data_type>` is train, valid, or eval.
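For example, with `--problem tsptw` and `--num_nodes 20` (an illustrative node count) and the command above, the files should look roughly like the following, assuming the three `--num_samples` values map to the train/valid/eval splits in that order:

```
data/tsptw/train_tsptw_20nodes_128000samples_seed1234.pkl
data/tsptw/valid_tsptw_20nodes_10000samples_seed1234.pkl
data/tsptw/eval_tsptw_20nodes_10000samples_seed1234.pkl
```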

### 2. Training

You can train the edge classifier with the following command. To use the CPU instead of a GPU, set the `--gpu` option to -1.

```bash
python train.py --problem <problem> --train_dataset_path <path/to/train_dataset> --valid_dataset_path <path/to/valid_dataset> --model_checkpoint_path checkpoints/<model_name> --gpu <gpu_number>
```
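As a concrete sketch, assuming the TSPTW dataset generated above and GPU 0 (the paths, checkpoint name, and GPU id are placeholders for your own settings):

```bash
# Illustrative invocation; adjust paths, checkpoint name, and GPU id to your setup
python train.py --problem tsptw \
    --train_dataset_path data/tsptw/train_tsptw_20nodes_128000samples_seed1234.pkl \
    --valid_dataset_path data/tsptw/valid_tsptw_20nodes_10000samples_seed1234.pkl \
    --model_checkpoint_path checkpoints/tsptw_model \
    --gpu 0
```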

### 3. Evaluating the edge classifier

You can evaluate the edge classifier with the following command. The best checkpoint in `--model_dir` is selected automatically, and the `--parallel` option here enables parallel dataset loading. Please ensure that the problem the model was trained on and the problem of the evaluation dataset are identical.

```bash
python eval_classifier.py --model_type nn --model_dir checkpoints/<model_name> --dataset_path <path/to/eval_dataset> --gpu <gpu_number> --parallel
```
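For example, continuing the TSPTW setup above (again, the checkpoint directory, dataset path, and GPU id are illustrative):

```bash
# Illustrative invocation; the checkpoint's problem and the eval dataset's problem must match (tsptw here)
python eval_classifier.py --model_type nn \
    --model_dir checkpoints/tsptw_model \
    --dataset_path data/tsptw/eval_tsptw_20nodes_10000samples_seed1234.pkl \
    --gpu 0 --parallel
```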

## 💬 Explanation generation (demo)

After launching the Streamlit app with the following command, go to http://localhost:8888 (replace 8888 with the host port mapped to your container_port). This is a standalone demo, so you may skip the above experiments and try this first.

```bash
streamlit run app.py --server.port <container_port>
```
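For example, if you started the container with `-p 8888:8888` as in the Docker example above, the ports line up as follows (8888 is only an illustrative choice):

```bash
# Inside the container; the port must match the container port you mapped with -p
streamlit run app.py --server.port 8888
```

The app is then reachable at http://localhost:8888 on the host.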

We also publish this demo on Hugging Face Spaces, so you can easily try it <a href="https://huggingface.co/spaces/oookiku/route-explainer" target="_blank">there</a>.

## 💽 Datasets and checkpoints (Work in progress)

Coming Soon!

## 🧪 Reproducibility (Work in progress)

<div id="rep">
Coming Soon!
</div>

๐Ÿž Bug reports and questions

If you encounter a bug or have any questions, please open an issue in this repo.

## 📄 Licence

Our code is licensed by NTT. Basically, the use of our code is limited to research purposes. See LICENSE for more details.

๐Ÿค Citation

If you find this work useful, please cite our paper as follows:

```bibtex
@article{dkiku2024routeexplainer,
  author  = {Daisuke Kikuta and Hiroki Ikeuchi and Kengo Tajiri and Yuusuke Nakano},
  title   = {RouteExplainer: An Explanation Framework for Vehicle Routing Problem},
  year    = {2024},
  journal = {arXiv preprint arXiv:2403.03585},
  url     = {https://arxiv.org/abs/2403.03585}
}
```