TuckER: Tensor Factorization for Knowledge Graph Completion

<p align="center"> <img src="https://raw.githubusercontent.com/ibalazevic/TuckER/master/tucker.png" width=400> </p>

This codebase contains a PyTorch implementation of the paper:

TuckER: Tensor Factorization for Knowledge Graph Completion. Ivana Balažević, Carl Allen, and Timothy M. Hospedales. Empirical Methods in Natural Language Processing (EMNLP), 2019. [Paper]

TuckER: Tensor Factorization for Knowledge Graph Completion. Ivana Balažević, Carl Allen, and Timothy M. Hospedales. ICML Adaptive & Multitask Learning Workshop, 2019. [Short Paper]
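TuckER scores a triple (subject, relation, object) by contracting a shared core tensor with the subject, relation, and object embeddings. The following is a minimal NumPy sketch of that scoring function with toy dimensions and random stand-in embeddings, not the trained model:

```python
import numpy as np

np.random.seed(0)

# Toy sizes for illustration; the paper uses 200-dimensional embeddings
# on the FB15k datasets.
de, dr = 4, 3                    # entity / relation embedding sizes
W = np.random.randn(de, dr, de)  # core tensor (learned in the real model)
e_s = np.random.randn(de)        # subject entity embedding
w_r = np.random.randn(dr)        # relation embedding
e_o = np.random.randn(de)        # object entity embedding

# TuckER's scoring function is the Tucker-style contraction
# phi(e_s, r, e_o) = W x_1 e_s x_2 w_r x_3 e_o:
score = np.einsum('i,ijk,j,k->', e_s, W, w_r, e_o)
```

In the full model this score is passed through a sigmoid to give the probability that the triple holds.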

Link Prediction Results

| Dataset   | MRR   | Hits@10 | Hits@3 | Hits@1 |
|-----------|-------|---------|--------|--------|
| FB15k     | 0.795 | 0.892   | 0.833  | 0.741  |
| WN18      | 0.953 | 0.958   | 0.955  | 0.949  |
| FB15k-237 | 0.358 | 0.544   | 0.394  | 0.266  |
| WN18RR    | 0.470 | 0.526   | 0.482  | 0.443  |
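All four metrics are simple functions of the rank assigned to the correct entity for each test triple (rank 1 is best). A sketch with made-up ranks, not the actual evaluation code:

```python
import numpy as np

# Hypothetical filtered ranks of the correct entity, one per test triple.
ranks = np.array([1, 3, 12, 2, 1])

mrr = np.mean(1.0 / ranks)         # mean reciprocal rank
hits_at_10 = np.mean(ranks <= 10)  # fraction ranked in the top 10
hits_at_3 = np.mean(ranks <= 3)
hits_at_1 = np.mean(ranks == 1)
```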

Running a model

To run the model, execute the following command:

```
CUDA_VISIBLE_DEVICES=0 python main.py --dataset FB15k-237 --num_iterations 500 --batch_size 128 \
    --lr 0.0005 --dr 1.0 --edim 200 --rdim 200 --input_dropout 0.3 \
    --hidden_dropout1 0.4 --hidden_dropout2 0.5 --label_smoothing 0.1
```

Available datasets are:

FB15k-237
WN18RR
FB15k
WN18

To reproduce the results from the paper, use the following combinations of hyperparameters with batch_size=128:

| dataset   | lr     | dr    | edim | rdim | input_d | hidden_d1 | hidden_d2 | label_smoothing |
|-----------|--------|-------|------|------|---------|-----------|-----------|-----------------|
| FB15k     | 0.003  | 0.99  | 200  | 200  | 0.2     | 0.2       | 0.3       | 0.0             |
| WN18      | 0.005  | 0.995 | 200  | 30   | 0.2     | 0.1       | 0.2       | 0.1             |
| FB15k-237 | 0.0005 | 1.0   | 200  | 200  | 0.3     | 0.4       | 0.5       | 0.1             |
| WN18RR    | 0.003  | 1.0   | 200  | 30   | 0.2     | 0.2       | 0.3       | 0.1             |
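The label_smoothing value softens the one-hot 1-N training targets so the model is not pushed toward extreme probabilities. A sketch of standard label smoothing (illustrative only; the exact form applied in main.py may differ):

```python
import numpy as np

num_entities, ls = 5, 0.1  # toy entity count; ls as in the table above
targets = np.zeros(num_entities)
targets[2] = 1.0           # one-hot: index of the true object entity

# Standard label smoothing: shrink the hard target toward a uniform
# distribution over all entities.
smoothed = (1.0 - ls) * targets + ls / num_entities
```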

Requirements

The codebase is implemented in Python 3.6.6. Required packages are:

numpy      1.15.1
pytorch    1.0.1

Citation

If you find this codebase useful, please cite:

@inproceedings{balazevic2019tucker,
  title={TuckER: Tensor Factorization for Knowledge Graph Completion},
  author={Bala\v{z}evi\'c, Ivana and Allen, Carl and Hospedales, Timothy M},
  booktitle={Empirical Methods in Natural Language Processing},
  year={2019}
}