MKGFormer

Code for the SIGIR 2022 paper "Hybrid Transformer with Multi-level Fusion for Multimodal Knowledge Graph Completion"

Model Architecture

<div align=center> <img src="resource/model.png" width="75%" height="75%" /> </div>

Illustration of MKGformer: (a) the unified multimodal KGC framework and (b) the detailed M-Encoder.

Requirements

To run the code, install the requirements:

pip install -r requirements.txt
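
If you prefer an isolated environment, a generic setup (not prescribed by this repository; the environment name below is arbitrary) could be:

python -m venv mkgformer-env          # or create a conda environment instead
source mkgformer-env/bin/activate
pip install -r requirements.txt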

Data Collection

The datasets that we used in our experiments are as follows:

The expected structure of files is:

MKGFormer
 |-- MKG	# Multimodal Knowledge Graph
 |    |-- dataset       # task data
 |    |-- data          # data process file
 |    |-- lit_models    # lightning model
 |    |-- models        # mkg model
 |    |-- scripts       # running script
 |    |-- main.py   
 |-- MNER	# Multimodal Named Entity Recognition
 |    |-- data          # task data
 |    |-- models        # mner model
 |    |-- modules       # running script
 |    |-- processor     # data process file
 |    |-- utils
 |    |-- run_mner.sh
 |    |-- run.py
 |-- MRE    # Multimodal Relation Extraction
 |    |-- data          # task data
 |    |-- models        # mre model
 |    |-- modules       # running script
 |    |-- processor     # data process file
 |    |-- run_mre.sh
 |    |-- run.py

How to Run
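
The exact commands are not spelled out here; a minimal sketch, assuming the entry points listed in the directory structure above (MKG/main.py with MKG/scripts, MNER/run_mner.sh, and MRE/run_mre.sh), might look like the following. The script name for MKG is a placeholder, so please check the scripts themselves for the provided options and arguments.

# Multimodal knowledge graph completion (MKG)
cd MKG
bash scripts/<dataset>.sh        # placeholder name; use one of the scripts shipped in MKG/scripts

# Multimodal named entity recognition (MNER)
cd ../MNER
bash run_mner.sh

# Multimodal relation extraction (MRE)
cd ../MRE
bash run_mre.sh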

Acknowledgement

The image data for the multimodal link prediction task is acquired using code from https://github.com/wangmengsd/RSME; many thanks to the authors.

Papers for the Project & How to Cite

If you use or extend our work, please cite the paper as follows:

@article{DBLP:journals/corr/abs-2205-02357,
  author    = {Xiang Chen and
               Ningyu Zhang and
               Lei Li and
               Shumin Deng and
               Chuanqi Tan and
               Changliang Xu and
               Fei Huang and
               Luo Si and
               Huajun Chen},
  title     = {Hybrid Transformer with Multi-level Fusion for Multimodal Knowledge
               Graph Completion},
  journal   = {CoRR},
  volume    = {abs/2205.02357},
  year      = {2022},
  url       = {https://doi.org/10.48550/arXiv.2205.02357},
  doi       = {10.48550/arXiv.2205.02357},
  eprinttype = {arXiv},
  eprint    = {2205.02357},
  timestamp = {Wed, 11 May 2022 17:29:40 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2205-02357.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}