<p align="center"> <a href="https://github.com/zjukg/NeuralKG/tree/main"> <img src="pics/neuralkg2.png" height="100"/></a> <a href="https://github.com/zjukg/NeuralKG/tree/ind"> <img src="pics/neuralkg-ind2.png" height="100"/></a> <p> <p align="center"> <a href="http://neuralkg.zjukg.cn/"> <img alt="Website" src="https://img.shields.io/badge/website-online-orange"> </a> <a href="https://pypi.org/project/neuralkg/"> <img alt="Pypi" src="https://img.shields.io/pypi/v/neuralkg"> </a> <a href="https://github.com/zjukg/NeuralKG/blob/main/LICENSE"> <img alt="Pypi" src="https://img.shields.io/badge/license-Apache--2.0-yellowgreen"> </a> <!-- <a href=""> <img alt="LICENSE" src="https://img.shields.io/badge/license-MIT-brightgreen"> </a> --> <a href="https://zjukg.github.io/NeuralKG/index.html"> <img alt="Documentation" src="https://img.shields.io/badge/Doc-online-blue"> </a> </p> <h1 align="center"> <p>An Open Source Library for Diverse Representation Learning of Knowledge Graphs</p> </h1> <p align="center"> <b> English | <a href="https://github.com/zjukg/NeuralKG/blob/main/README_CN.md">中文</a> </b> </p>

NeuralKG is a Python-based library for diverse representation learning of knowledge graphs, implementing Conventional KGEs, GNN-based KGEs, and Rule-based KGEs. We provide comprehensive documentation for beginners and an online website to organize an open and shared KG representation learning community.

<br>


# 😃What's New

<br>

# Overview

<h3 align="center"> <img src="pics/overview.png" width="600"> </h3> <!-- <p align="center"> <a href=""> <img src="pics/overview.png" width="400"/></a> <p> -->

NeuralKG is built on PyTorch Lightning. It provides a general workflow for diverse representation learning on KGs and is highly modularized, supporting three series of KGE methods: Conventional KGEs, GNN-based KGEs, and Rule-based KGEs.

<br>

# Demo

Below is a demonstration of NeuralKG.

<!-- ![框架](./pics/demo.gif) --> <img src="pics/demo.gif"> <!-- <img src="pics/demo.gif" width="900" height="476" align=center> --> <br>

# Implemented KGEs

| Components | Models |
|------------|--------|
| KGEModel | TransE, TransH, TransR, ComplEx, DistMult, RotatE, ConvE, BoxE, CrossE, SimplE, HAKE, PairRE, DualE |
| GNNModel | RGCN, KBAT, CompGCN, XTransE, SEGNN |
| RuleModel | ComplEx-NNE+AER, RUGE, IterE |
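As an illustration of what a Conventional KGE model in the table computes, here is a minimal sketch of the TransE scoring function. This is not NeuralKG's implementation (which operates on batched tensors); it only shows the underlying idea: TransE scores a triple (h, r, t) by how well the relation embedding translates the head embedding onto the tail, score = -||h + r - t||.

```python
import math

def transe_score(h, r, t):
    """TransE score for one triple: negative L2 distance between
    (h + r) and t. Higher (closer to 0) means a more plausible triple."""
    return -math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# A triple whose embeddings satisfy h + r == t gets the best possible score, 0.
perfect = transe_score([1.0, 2.0], [3.0, -2.0], [4.0, 0.0])
# A mismatched tail is pushed further from zero.
worse = transe_score([1.0, 2.0], [3.0, -2.0], [4.0, 1.0])
```

The other model families in the table differ mainly in this scoring function (e.g. ComplEx and DistMult use multiplicative scores, RotatE rotates in complex space), while the training loop stays largely the same.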
<br>

# Quick Start

## Installation

**Step1** Create a virtual environment using Anaconda and enter it

```bash
conda create -n neuralkg python=3.8
conda activate neuralkg
```

**Step2** Install PyTorch and DGL matching your CUDA version

Here we give a sample installation based on CUDA 11.1:

```bash
pip install torch==1.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
pip install dgl-cu111 dglgo -f https://data.dgl.ai/wheels/repo.html
```

**Step3** Install the package

From PyPI:

```bash
pip install neuralkg
```

Or from source:

```bash
git clone https://github.com/zjukg/NeuralKG.git
cd NeuralKG
python setup.py install
```

## Training

```bash
# Use bash script
sh ./scripts/your-sh

# Use config
python main.py --load_config --config_path <your-config>
```

## Evaluation

```bash
python main.py --test_only --checkpoint_dir <your-model-path>
```

## Hyperparameter Tuning

NeuralKG utilizes Weights & Biases to support various forms of hyperparameter optimization, such as grid search, random search, and Bayesian optimization. The search type and search space are specified in a `*.yaml` configuration file.

The following config file shows hyperparameter optimization of TransE on the FB15K-237 dataset using Bayesian search:

```yaml
command:
  - ${env}
  - ${interpreter}
  - ${program}
  - ${args}
program: main.py
method: bayes
metric:
  goal: maximize
  name: Eval|hits@10
parameters:
  dataset_name:
    value: FB15K237
  model_name:
    value: TransE
  loss_name:
    values: [Adv_Loss, Margin_Loss]
  train_sampler_class:
    values: [UniSampler, BernSampler]
  emb_dim:
    values: [400, 600]
  lr:
    values: [1e-4, 5e-5, 1e-6]
  train_bs:
    values: [1024, 512]
  num_neg:
    values: [128, 256]
```
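For reference, the discrete search space defined by the `values` fields above can be enumerated to see how many trials an exhaustive grid would require (Bayesian search samples this space adaptively instead of trying every combination). A small self-contained sketch, using only the values from the config:

```python
# Enumerate the discrete search space spelled out in the sweep config above.
from itertools import product

search_space = {
    "loss_name": ["Adv_Loss", "Margin_Loss"],
    "train_sampler_class": ["UniSampler", "BernSampler"],
    "emb_dim": [400, 600],
    "lr": [1e-4, 5e-5, 1e-6],
    "train_bs": [1024, 512],
    "num_neg": [128, 256],
}

combos = [dict(zip(search_space, vals)) for vals in product(*search_space.values())]
print(len(combos))  # → 96 (2 * 2 * 2 * 3 * 2 * 2)
```

Since a full grid needs 96 runs, Bayesian search is a sensible choice here: it can find strong settings in far fewer trials.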
<br>

# Reproduced Results

Below are some reproduced model results on the FB15K-237 dataset using NeuralKG. See more results in the detailed documentation.

| Method | MRR | Hits@1 | Hits@3 | Hits@10 |
|--------|-----|--------|--------|---------|
| TransE | 0.32 | 0.23 | 0.36 | 0.51 |
| TransR | 0.23 | 0.16 | 0.26 | 0.38 |
| TransH | 0.31 | 0.2 | 0.34 | 0.50 |
| DistMult | 0.30 | 0.22 | 0.33 | 0.48 |
| ComplEx | 0.25 | 0.17 | 0.27 | 0.40 |
| SimplE | 0.16 | 0.09 | 0.17 | 0.29 |
| ConvE | 0.32 | 0.23 | 0.35 | 0.50 |
| RotatE | 0.33 | 0.23 | 0.37 | 0.53 |
| BoxE | 0.32 | 0.22 | 0.36 | 0.52 |
| HAKE | 0.34 | 0.24 | 0.38 | 0.54 |
| PairRE | 0.35 | 0.25 | 0.38 | 0.54 |
| DualE | 0.33 | 0.24 | 0.36 | 0.52 |
| XTransE | 0.29 | 0.19 | 0.31 | 0.45 |
| RGCN | 0.25 | 0.16 | 0.27 | 0.43 |
| KBAT* | 0.28 | 0.18 | 0.31 | 0.46 |
| CompGCN | 0.34 | 0.25 | 0.38 | 0.52 |
| SEGNN | 0.36 | 0.27 | 0.39 | 0.54 |
| IterE | 0.26 | 0.19 | 0.29 | 0.41 |

*: There is a label leakage error in KBAT, so the corrected result is worse than the result reported in the paper. Details in https://github.com/deepakn97/relationPrediction/issues/28
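The MRR and Hits@k columns above are standard link-prediction metrics, computed from the rank each test triple's correct entity receives among all candidate entities. A minimal, library-independent sketch of how they are computed (the ranks below are made up for illustration):

```python
def mrr(ranks):
    """Mean reciprocal rank: average of 1/rank over all test triples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at(k, ranks):
    """Hits@k: fraction of test triples whose correct entity ranks in the top k."""
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 3, 2, 10, 50]          # hypothetical ranks for five test triples
print(round(mrr(ranks), 3))        # → 0.391
print(hits_at(10, ranks))          # → 0.8
```

MRR rewards placing the correct entity near the very top, while Hits@10 only asks whether it lands anywhere in the top 10, which is why the two columns can disagree about model ordering.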

<br>

# Notebook Guide

😃We provide some Colab notebooks to help users get started with our library.

Colab Notebook

<br>

# Detailed Documentation

https://zjukg.github.io/NeuralKG/neuralkg.html

<br>

# Citation

Please cite our paper if you use NeuralKG in your work.

```bibtex
@inproceedings{neuralkg,
  author    = {Wen Zhang and
               Xiangnan Chen and
               Zhen Yao and
               Mingyang Chen and
               Yushan Zhu and
               Hongtao Yu and
               Yufeng Huang and
               Yajing Xu and
               Ningyu Zhang and
               Zezhong Xu and
               Zonggang Yuan and
               Feiyu Xiong and
               Huajun Chen},
  title     = {NeuralKG: An Open Source Library for Diverse Representation Learning
               of Knowledge Graphs},
  booktitle = {{SIGIR}},
  pages     = {3323--3328},
  publisher = {{ACM}},
  year      = {2022}
}
```

<br>

# NeuralKG Core Team

Wen Zhang, Xiangnan Chen, Zhen Yao, Mingyang Chen, Yushan Zhu, Hongtao Yu, Yufeng Huang, Zezhong Xu, Yajing Xu, Peng Ye, Yichi Zhang, Ningyu Zhang, Guozhou Zheng, Haofen Wang, Huajun Chen