Attention is not not Explanation

Code for the EMNLP 2019 paper Attention is not not Explanation by Wiegreffe & Pinter.

When using this codebase, please cite:

@inproceedings{wiegreffe-pinter-2019-attention,
    title = "Attention is not not Explanation",
    author = "Wiegreffe, Sarah  and
      Pinter, Yuval",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/D19-1002",
    doi = "10.18653/v1/D19-1002",
    pages = "11--20"
}

We've based our repository on the code provided by Sarthak Jain & Byron Wallace for their paper Attention is not Explanation.

Dependencies

Please refer to the installation instructions in the repository provided by Jain & Wallace; we use the same dependencies. Also, make sure to add the parent directory into which you clone attention to your PYTHONPATH so that the imports resolve correctly. For example, if the path to the cloned directory is /home/users/attention/, then run export PYTHONPATH='/home/users'.

Data Preprocessing

Please follow the preprocessing instructions provided in the Jain & Wallace repository. We replicated these instructions for the Diabetes, Anemia, SST, IMDb, AgNews, and 20News datasets.

Running Baselines

We replicate the baselines reported in Jain & Wallace's paper (shown in Table 2 of our paper) by running the commands for each of the experiments below:

Freezing the Attention Distribution (Section 3.1)
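
Conceptually, this experiment imposes a fixed (e.g., uniform) attention distribution and trains only the remaining model parameters, testing how much learned attention contributes to performance. Below is a minimal PyTorch-style sketch of the idea; it is not the repository's training code, and the class name FrozenAttentionClassifier and the uniform weight choice are illustrative assumptions:

    import torch
    import torch.nn as nn

    class FrozenAttentionClassifier(nn.Module):
        """Classifier whose attention weights are fixed rather than learned.

        The frozen distribution is registered as a buffer so the optimizer
        never updates it; here it is uniform over positions.
        """
        def __init__(self, vocab_size, embed_dim, hidden_dim, max_len):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, 1)
            self.register_buffer("frozen_alpha",
                                 torch.full((max_len,), 1.0 / max_len))

        def forward(self, tokens):                    # tokens: (batch, max_len)
            h, _ = self.encoder(self.embed(tokens))   # (batch, max_len, hidden)
            alpha = self.frozen_alpha.view(1, -1, 1)  # broadcast over the batch
            context = (alpha * h).sum(dim=1)          # fixed-weight average
            return self.out(context).squeeze(-1)      # logits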

Running Random Seeds Experiments (Section 3.2)
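
This experiment retrains the same architecture under different random seeds to establish how much attention distributions vary by chance alone, measured with Jensen-Shannon divergence. Below is a minimal sketch of the two ingredients, seeding and JSD; the function names set_seed and js_divergence are hypothetical, and the training loop is elided:

    import random
    import numpy as np
    import torch

    def set_seed(seed):
        """Fix all sources of randomness so a single run is repeatable."""
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)

    def js_divergence(p, q, eps=1e-12):
        """Jensen-Shannon divergence between two attention distributions
        (tensors of shape (..., seq_len) summing to 1 on the last axis)."""
        p, q = p + eps, q + eps
        m = 0.5 * (p + q)
        kl = lambda a, b: (a * (a / b).log()).sum(dim=-1)
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    # One model per seed; attention maps are then compared pairwise.
    for seed in [1, 2, 3, 4, 5]:
        set_seed(seed)
        # model = train(...)  # training elided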

Running BOWs Experiments (Section 3.3)
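
The bag-of-words (BOW) variants drop the contextual encoder, so attention is computed directly over individual word embeddings and each weight reflects a single token. Below is a minimal sketch under that reading; the class name BowAttentionClassifier is a hypothetical stand-in for the repository's model:

    import torch
    import torch.nn as nn

    class BowAttentionClassifier(nn.Module):
        """Non-contextual variant: attention scores are computed from word
        embeddings alone, with no recurrent encoder in between."""
        def __init__(self, vocab_size, embed_dim):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.attn_score = nn.Linear(embed_dim, 1)
            self.out = nn.Linear(embed_dim, 1)

        def forward(self, tokens):                    # tokens: (batch, seq_len)
            e = self.embed(tokens)                    # (batch, seq_len, embed)
            scores = self.attn_score(e).squeeze(-1)   # (batch, seq_len)
            alpha = torch.softmax(scores, dim=-1)     # attention over tokens
            context = (alpha.unsqueeze(-1) * e).sum(dim=1)
            return self.out(context).squeeze(-1)      # logits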

Running Adversarial Model Experiments (Section 4)
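
The adversarial experiments train a second model to match the base model's predictions while pushing its attention distributions away from the base model's. Below is a conceptual sketch of one such objective, using total variation distance for predictions and a KL term for attention; the function name adversarial_loss, the divergence direction, and the weighting parameter lam are assumptions for illustration, not the paper's exact formulation:

    import torch

    def adversarial_loss(pred_adv, pred_base, alpha_adv, alpha_base,
                         lam, eps=1e-12):
        """Keep predictions close to the frozen base model (small TVD) while
        pushing attention away from its distribution (large KL), traded off
        by the hyperparameter `lam`.

        pred_*  : (batch, n_classes) prediction distributions
        alpha_* : (batch, seq_len) attention distributions
        """
        tvd = 0.5 * (pred_adv - pred_base).abs().sum(dim=-1)
        kl = (alpha_base * ((alpha_base + eps) / (alpha_adv + eps)).log()).sum(dim=-1)
        return (tvd - lam * kl).mean()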