RE-AGCN

This is the implementation of our ACL 2021 paper Dependency-driven Relation Extraction with Attentive Graph Convolutional Networks.

You can e-mail Yuanhe Tian at yhtian@uw.edu if you have any questions.

Citation

If you use or extend our work, please cite our ACL 2021 paper.

@inproceedings{tian-etal-2021-dependency,
    title = "Dependency-driven Relation Extraction with Attentive Graph Convolutional Networks",
    author = "Tian, Yuanhe and Chen, Guimin and Song, Yan and Wan, Xiang",
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    pages = "4458--4471",
}

Requirements

Our code works with the following environment.

Dataset

To obtain the data, please go to the data directory for details.

Downloading BERT

In our paper, we use BERT (Devlin et al., 2019) as the encoder.

For BERT, please download the pre-trained English BERT-Base and BERT-Large models from Google or from HuggingFace. If you download them from Google, you need to convert the models from the TensorFlow format to the PyTorch format.
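
If you start from the Google (TensorFlow) checkpoints, one possible way to do the conversion is with the HuggingFace transformers command-line tool. This is only a sketch: it assumes the transformers package is installed, and the checkpoint paths are placeholders you need to adapt to your setup.

    # Sketch of the TensorFlow-to-PyTorch conversion using the HuggingFace
    # transformers CLI (assumes the transformers package is installed).
    # BERT_DIR is a placeholder for the unzipped Google checkpoint directory,
    # e.g. uncased_L-12_H-768_A-12 for BERT-Base.
    export BERT_DIR=/path/to/uncased_L-12_H-768_A-12

    transformers-cli convert --model_type bert \
        --tf_checkpoint $BERT_DIR/bert_model.ckpt \
        --config $BERT_DIR/bert_config.json \
        --pytorch_dump_output $BERT_DIR/pytorch_model.bin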

Downloading our pre-trained RE-AGCN

For RE-AGCN, you can download the models we trained in our experiments from Google Drive.

Run on Sample Data

Run run_sample.sh to train a model on the small sample data under the sample_data directory.
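
For example, from the repository root:

    # Train a model on the small sample data under sample_data.
    bash run_sample.sh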

Training and Testing

You can find the command lines to train and test models in run_train.sh and run_test.sh, respectively.
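
For example, from the repository root (you may need to adjust the paths and parameters inside the scripts to match your data and BERT locations):

    # Train a model with the settings in run_train.sh.
    bash run_train.sh

    # Test a trained model with the settings in run_test.sh.
    bash run_test.sh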

Here are some important parameters:

To-do List

You can leave comments in the Issues section if you want us to implement any additional functions.