Generative Replay Inspired by Hippocampal Memory Indexing

Code for the paper "Generative Replay Inspired by Hippocampal Memory Indexing for Continual Language Learning" in the 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), by Aru Maekawa, Hidetaka Kamigaito, Kotaro Funakoshi, and Manabu Okumura.

This code is based on the open-source implementation of "LAnguage MOdeling for Lifelong Language Learning" (LAMOL). Most of the settings follow theirs.

Examples

Pretraining:

./pretrain.sh

Training:

./train.sh --seq_train_type hmi-lamol --tasks sst srl woz.en

Test:

./test.sh --seq_train_type hmi-lamol --tasks sst srl woz.en
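The three steps above can be chained into one run. A minimal dry-run sketch (a hypothetical wrapper, not part of the repo) that prints the commands from the Examples section in order; remove the `echo` prefixes to actually execute them, assuming the scripts sit in the repo root:

```shell
# Dry-run of the full pipeline: pretrain once, then train and test
# sequentially on the same task order. Commands are taken verbatim
# from the Examples section above.
TASKS="sst srl woz.en"
echo "./pretrain.sh"
echo "./train.sh --seq_train_type hmi-lamol --tasks $TASKS"
echo "./test.sh --seq_train_type hmi-lamol --tasks $TASKS"
```

Note that the same `--seq_train_type` and `--tasks` arguments must be passed to both `train.sh` and `test.sh`, since testing loads the model trained under that configuration.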

Acknowledgements

Citation

@inproceedings{maekawa-etal-2023-generative,
    title = "Generative Replay Inspired by Hippocampal Memory Indexing for Continual Language Learning",
    author = "Maekawa, Aru  and
              Kamigaito, Hidetaka  and
              Funakoshi, Kotaro  and
              Okumura, Manabu",
    booktitle = "Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics",
    month = may,
    year = "2023",
    address = "Dubrovnik, Croatia",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.eacl-main.65",
    pages = "930--942",
}

@inproceedings{sun2020lamal,
    title={{LAMAL}: {LA}nguage Modeling Is All You Need for Lifelong Language Learning},
    author={Fan-Keng Sun and Cheng-Hao Ho and Hung-Yi Lee},
    booktitle={International Conference on Learning Representations},
    year={2020},
    url={https://openreview.net/forum?id=Skgxcn4YDS}
}