PyTorch implementation of EvoExplore

Paper: "Temporal knowledge graph representation learning with local and global evolutions"

In this paper, we propose a novel framework that learns representations for temporal knowledge graphs (TKGs) by modeling the co-evolution of their local and global structures. This repository contains the benchmark datasets used in the paper and the implementation of our proposed framework.

<p align="center"><img src="Evo.PNG"/></p>

First, we regard the local structure evolution as the establishment process of relations between different entities, and propose a novel hierarchical-attention-based temporal point process to model the establishment of each relation. This allows our framework to capture the evolutionary nature of a TKG from a local perspective. Second, we regard the global structure evolution as the TKG's time-evolving community partition, and design a soft modularity metric to model its community structure. By jointly maximizing the soft modularity at each timestamp, our framework captures the evolutionary nature of the TKG from a global perspective. Finally, we employ a multi-task loss function to jointly optimize the two parts, which allows EvoExplore to capture the co-evolutionary pattern between local and global structure evolutions. Experimental results demonstrate the superiority of our proposed method over existing baselines.
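At a high level, the joint optimization can be pictured as a weighted sum of the two losses. Below is a minimal PyTorch-style sketch; the names local_loss, global_loss, and the trade-off weight lam are illustrative assumptions, not identifiers from this repository:

# A sketch of the multi-task objective described above (assumed names).
def joint_step(model, batch, optimizer, lam=0.5):
    optimizer.zero_grad()
    local_loss = model.local_loss(batch)    # temporal point process term (local evolution)
    global_loss = model.global_loss(batch)  # negated soft modularity term (global evolution)
    loss = local_loss + lam * global_loss   # multi-task loss combining both views
    loss.backward()
    optimizer.step()
    return loss.item()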

If you make use of this code in your work, please cite the following paper:

@article{Zhang2022EvoExplore,
  title = {Temporal knowledge graph representation learning with local and global evolutions},
  author = {Jiasheng Zhang and Yongpan Sheng and Shuang Liang and Jie Shao},
  journal = {Knowledge-Based Systems},
  volume = {251},
  pages = {109234},
  year = {2022}
}

Contents

  1. Installation
  2. Train_and_Test
  3. Datasets
  4. Baselines
  5. Contact

Installation

Install the following packages:

pip install torch
pip install numpy

For GPU training, install CUDA and cuDNN following the official NVIDIA instructions, and make sure the installed torch build matches your CUDA version.
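You can then check that PyTorch sees the GPU:

python -c "import torch; print(torch.cuda.is_available())"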

Then clone the repository:

git clone https://github.com/zjs123/EvoExplore.git

Both the data processing scripts and the model code are written in Python 3.

Train_and_Test

Use the following command to reproduce the reported results of our models; the training and evaluation processes are performed automatically.

python Main.py -dataset ICEWS14

Some of the important available options are listed below, followed by a sketch of how they might be parsed:

        '-hidden',     default=100,       type=int,   help='dimension of the learned embeddings'
        '-lr',         default=0.001,     type=float, help='learning rate'
        '-ns',         default=10,        type=int,   help='number of negative samples used in training'
        '-dataset',    default="ICEWS14", choices=["ICEWS14", "ICEWS05", "GDELT"], help='dataset used for training'
        '-numOfEpoch', default=300,       type=int,   help='number of training epochs'
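These options presumably correspond to an argparse parser in Main.py; the following is a minimal sketch under that assumption, and the actual code in Main.py may differ:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-hidden', default=100, type=int, help='dimension of the learned embeddings')
parser.add_argument('-lr', default=0.001, type=float, help='learning rate')
parser.add_argument('-ns', default=10, type=int, help='number of negative samples used in training')
parser.add_argument('-dataset', default='ICEWS14', choices=['ICEWS14', 'ICEWS05', 'GDELT'], help='dataset used for training')
parser.add_argument('-numOfEpoch', default=300, type=int, help='number of training epochs')
args = parser.parse_args()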

Datasets

Four datasets are used in our experiments: ICEWS14, ICEWS05-15, ICEWS18, and GDELT. Facts in each dataset are quadruples of the form "[subject entity, relation, object entity, time]". Each data folder contains four files:

- train.txt, test.txt, valid.txt: in each line, the first column is the index of the subject entity, the second the index of the relation, the third the index of the object entity, and the fourth the time at which the fact happened (see the loader sketch below).

- stat.txt: the number of entities and the number of relations.
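For illustration, a minimal loader for the quadruple files, assuming whitespace-separated columns (the actual delimiter may differ):

def load_quadruples(path):
    # Each line: subject index, relation index, object index, time.
    quadruples = []
    with open(path) as f:
        for line in f:
            s, r, o, t = line.strip().split()[:4]
            quadruples.append((int(s), int(r), int(o), t))
    return quadruples

train = load_quadruples('ICEWS14/train.txt')  # hypothetical path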

The detailed statistics of each dataset:

| Datasets | Num of Entities | Num of Relations | Num of Timestamps | Train | Valid | Test |
|---|---|---|---|---|---|---|
| ICEWS14 (García-Durán et al., 2018) | 7,128 | 230 | 365 | 72,826 | 8,941 | 8,963 |
| ICEWS05-15 (García-Durán et al., 2018) | 10,488 | 251 | 4,017 | 386,962 | 46,275 | 46,092 |
| ICEWS18 (Han et al., 2020) | 23,033 | 256 | 304 | 373,018 | 45,995 | 49,545 |
| GDELT (Goel et al., 2018) | 500 | 20 | 366 | 2,735,685 | 341,961 | 341,961 |

Baselines

We use the following public code for the baseline experiments.

| Baselines | Code | Embedding size | Batch num |
|---|---|---|---|
| TransE (Bordes et al., 2013) | Link | 100, 200 | 100, 200 |
| TTransE (Leblay et al., 2018) | Link | 50, 100, 200 | 100, 200 |
| TA-TransE (García-Durán et al., 2018) | Link | 100, 200 | Default |
| HyTE (Dasgupta et al., 2018) | Link | Default | Default |
| DE-DistMult (Goel et al., 2020) | Link | Default | Default |
| TNTComplEx (Lacroix et al., 2020) | Link | Default | Default |
| ATiSE (Xu et al., 2020) | Link | Default | Default |

Contact

For any questions or suggestions, you can use the issues section or contact us at shengyp2011@gmail.com or zjss12358@gmail.com.