🧊 ICE for Continual IE

🚀 Introduction

This is the official repository for the paper "Teamwork Is Not Always Good: An Empirical Study of Classifier Drift in Class-incremental Information Extraction" (Findings of ACL 2023).

🔧 Basic Requirements

transformers==4.18.0
torch==1.7.1
torchmeta==1.8.0
numpy==1.19.5
tqdm==4.62.3
pip install -r requirements.txt

💾 Data Preparation

Place each dataset split at ./data/{DATASET_NAME}/{DATASET_SPLIT}.jsonl, then run:

python gen_pretrain_feature.py

The script will generate the preprocessed feature files under the corresponding dataset directory. You can change the dataset variable inside the script to generate features for different datasets.
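For concreteness, here is a minimal sketch of the data-preparation workflow; the dataset name MAVEN and the train/dev/test split names are placeholders for illustration only, not a list of the datasets this repository supports.

# Expected raw-data layout (dataset and split names are placeholders):
#   ./data/MAVEN/train.jsonl
#   ./data/MAVEN/dev.jsonl
#   ./data/MAVEN/test.jsonl
# Point the dataset variable inside gen_pretrain_feature.py at the target
# dataset, then generate its preprocessed feature files:
python gen_pretrain_feature.py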

⚙️ Training & Evaluation

To run training and evaluation, execute:

./scripts/run_main.sh

Please see the comments in the script for more details on the arguments.
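As a minimal example of invoking the script, the lines below launch a run with the settings already configured inside it; the log file name is only an illustrative suggestion.

# Launch class-incremental training and evaluation with the settings
# configured inside the script (see its comments for the arguments):
bash ./scripts/run_main.sh
# Optionally capture the output for later inspection (example log name):
bash ./scripts/run_main.sh 2>&1 | tee run_main.log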

📚 Reference

Please consider citing our paper if you find it useful or interesting.

@inproceedings{liu-etal-2023-teamwork,
    title = "Teamwork Is Not Always Good: An Empirical Study of Classifier Drift in Class-incremental Information Extraction",
    author = "Liu, Minqian  and
      Huang, Lifu",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics"
}

Acknowledgement

Parts of the code in this repository are adapted from the work Incremental Prompting. We thank the members of the VT NLP Lab for their constructive comments on this work.