Dynamic Relation-Attentive Graph Neural Networks for Fraud Detection

This code is the official implementation of the following paper:

Heehyeon Kim, Jinhyeok Choi, and Joyce Jiyoung Whang, Dynamic Relation-Attentive Graph Neural Networks for Fraud Detection, Machine Learning on Graphs (MLoG) Workshop at the 23rd IEEE International Conference on Data Mining (ICDM), 2023

All code was written by Heehyeon Kim (heehyeon@kaist.ac.kr) and Jinhyeok Choi (cjh0507@kaist.ac.kr). When you use this code, please cite our paper.

@inproceedings{drag,
  author = {Heehyeon Kim and Jinhyeok Choi and Joyce Jiyoung Whang},
  booktitle = {2023 IEEE International Conference on Data Mining Workshops (ICDMW)},
  title = {Dynamic Relation-Attentive Graph Neural Networks for Fraud Detection},
  year = {2023},
  pages = {1092-1096}
}

Requirements

We used Python 3.8, PyTorch 1.12.1, and DGL 1.0.2 with cudatoolkit 11.3.
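
If you want to confirm that your environment matches these versions, here is a minimal sketch (assuming only the packages listed above are installed):

```python
# Minimal environment check against the versions listed above.
import sys
import torch
import dgl

print(sys.version_info[:2])        # expect (3, 8)
print(torch.__version__)           # expect 1.12.1
print(dgl.__version__)             # expect 1.0.2
print(torch.cuda.is_available())   # True when the CUDA 11.3 toolkit is usable
```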

Usage

DRAG

We used NVIDIA RTX A6000 and NVIDIA GeForce RTX 3090 GPUs for all our experiments. We provide a template configuration file (template.json) for the YelpChi and Amazon_new datasets.

To train DRAG, use the run.py file as follows:

python run.py --exp_config_path=./template.json
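
The exact arguments are defined by this repository (see template.json and the handler files referenced below); the key names in the following sketch are hypothetical placeholders, intended only to show how a custom configuration file could be written:

```python
# Hypothetical sketch of writing a custom configuration file.
# The key names below are placeholders; the real argument names are
# those used in template.json and the model/data handlers.
import json

config = {
    "dataset_name": "YelpChi",   # or "Amazon_new"
    "train_ratio": 0.4,          # appears in the experiment ID (assumed key name)
    "seed": 0,                   # appears in the experiment ID (assumed key name)
    "save_dir": "./results",     # directory where results are saved (assumed key name)
}

with open("my_config.json", "w") as f:
    json.dump(config, f, indent=2)
```

The resulting file can then be passed to run.py, e.g. python run.py --exp_config_path=./my_config.json.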

Results will be printed in the terminal and saved in the directory designated by the configuration file.

Each run is assigned an experiment ID of the form f"{dataset_name}-{train_ratio}-{seed}-{time}".

You can find log files and pandas DataFrame pickle files associated with experiment IDs in the designated directory.

utils.py provides several useful functions for handling experiment results.

You can find an example in performance_check.ipynb.
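
For instance, a run's DataFrame pickle can also be inspected directly with pandas; the results directory and file name in this sketch are hypothetical, since the actual layout is set by the configuration file and demonstrated in performance_check.ipynb:

```python
# Hypothetical sketch: load the pandas DataFrame pickle saved for one run.
# The results directory and file name below are placeholders; the real paths
# are determined by the configuration file.
import pandas as pd

experiment_id = "YelpChi-0.4-0-20231201_120000"   # f"{dataset_name}-{train_ratio}-{seed}-{time}"
df = pd.read_pickle(f"./results/{experiment_id}.pkl")

print(df.head())   # inspect the logged results for this run
```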

Training from Scratch

To train DRAG from scratch, run run.py with the configuration file. Please refer to model_handler.py, data_handler.py, and model.py for examples of the arguments in the configuration file.

The list of arguments of the configuration file:

Hyperparameters

We tuned DRAG over the following ranges:

Description of each file