T0_continual_learning

Adding new tasks to an LLM without catastrophic forgetting

Paper:

https://arxiv.org/abs/2205.12393

Models:

https://huggingface.co/ThomasNLG/CT0-11B
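
A minimal inference sketch (an assumption about usage, not code taken from the notebook): the checkpoint can be loaded as a standard T5-style seq2seq model with the `transformers` library. The prompt below is only illustrative.

```python
# Minimal inference sketch (assumption: the checkpoint is a standard
# T5-style seq2seq model on the Hugging Face Hub).
# The 11B model needs a large GPU, or CPU/disk offloading.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "ThomasNLG/CT0-11B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative prompt; see the paper and the notebook for the actual task prompts.
prompt = "Simplify the following sentence: The committee convened to deliberate on the proposal."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```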

Code:

We haven't cleaned up all the code yet, but most of the different steps can be found in this Colab notebook.

In particular, the notebook contains:

For training, we plan to release the scripts. But you don't need to wait for them: we applied nothing fancy, simply fine-tuning T5 using the standard Hugging Face framework. All the parameters are mentioned in our paper.
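
As a rough sketch of what such a fine-tuning run looks like with the standard Hugging Face `Seq2SeqTrainer` (the dataset, column names, and hyperparameters below are illustrative placeholders, not the values from the paper; refer to the paper for the actual configuration, including how the continual-learning data mixture is built):

```python
# Hedged sketch of standard seq2seq fine-tuning with the Hugging Face Trainer.
# Dataset, column names, and hyperparameters are placeholders; the actual
# values used for CT0 are reported in the paper.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "bigscience/T0_3B"  # smaller variant used here for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical JSON file with "input" / "target" text columns.
dataset = load_dataset("json", data_files={"train": "train.json"})

def preprocess(batch):
    model_inputs = tokenizer(batch["input"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["target"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset["train"].map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

training_args = Seq2SeqTrainingArguments(
    output_dir="ct0-finetuned",
    per_device_train_batch_size=8,
    learning_rate=1e-4,
    num_train_epochs=1,
    logging_steps=100,
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```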

Material:

All the material required by the notebook, including the training data, the predictions, and the checkpoints, is publicly available in the main folders.