Child-Tuning

Source code for the EMNLP 2021 long paper "Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning".
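Child-Tuning fine-tunes only a "child" subset of the network's parameters by masking gradients at each update. As a rough illustration of the idea (not the code shipped in this repository), the sketch below shows the task-free variant, ChildTuning-F, in plain PyTorch; the function name and the reserve_p default are placeholders.

```python
import torch

def child_tuning_f_step(model, optimizer, loss, reserve_p=0.3):
    """One update with the task-free variant (ChildTuning-F).

    A fresh Bernoulli(reserve_p) mask is drawn for every parameter at every
    step; gradients outside the sampled child network are zeroed and the
    remaining ones are rescaled by 1/reserve_p so the expected gradient
    magnitude is preserved.
    """
    optimizer.zero_grad()
    loss.backward()
    for param in model.parameters():
        if param.grad is not None:
            mask = torch.bernoulli(torch.full_like(param.grad, reserve_p))
            param.grad.mul_(mask).div_(reserve_p)
    optimizer.step()
```

ChildTuning-F resamples the mask at every step; the paper's task-driven variant, ChildTuning-D, instead fixes the child network in advance (see the sketch after Section 3 below).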

1. Environments

2. Dependencies

3. Training and Evaluation

>> bash run.sh

You can change the settings in this script.
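For completeness, the task-driven variant, ChildTuning-D, fixes the child network before fine-tuning by keeping the parameters with the highest empirical Fisher information on the task data. The sketch below only illustrates that idea, using a per-parameter top-k selection as a simplification; names such as estimate_child_mask and reserve_p are placeholders, not this repository's API.

```python
import torch

def estimate_child_mask(model, dataloader, loss_fn, reserve_p=0.3):
    """Rough sketch of ChildTuning-D child-network selection.

    Accumulates squared gradients over the task data as an empirical Fisher
    estimate, then keeps the top reserve_p fraction of entries per parameter
    as a fixed binary mask applied to gradients during fine-tuning.
    """
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    for inputs, labels in dataloader:
        model.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2

    masks = {}
    for n, f in fisher.items():
        k = max(1, int(reserve_p * f.numel()))
        threshold = torch.topk(f.flatten(), k).values.min()
        masks[n] = (f >= threshold).float()
    return masks
```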

4. Citation

If you use this work or code, please cite the following paper:

@inproceedings{xu-etal-2021-childtuning,
    title = "Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning",
    author = "Runxin Xu and
    Fuli Luo and Zhiyuan Zhang and
    Chuanqi Tan and Baobao Chang and
    Songfang Huang and Fei Huang",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    year = "2021",
    publisher = "Association for Computational Linguistics",
}