# Tuning Language Models by Proxy
This repository contains code for the paper [Tuning Language Models by Proxy](https://arxiv.org/abs/2401.08565) (2024). If you have any questions, please feel free to open a GitHub issue or reach out to the first author at alisaliu@cs.washington.edu.
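Proxy-tuning, the method introduced in the paper, steers a large base model at decoding time by shifting its next-token logits by the difference between a small tuned "expert" and its untuned "anti-expert" counterpart. The sketch below illustrates that decoding rule with Hugging Face `transformers`; the model names reflect the paper's Llama-2 setup, but the greedy loop and the function `proxy_tuned_generate` are simplified illustrations, not this repository's actual implementation.

```python
# Minimal sketch of the proxy-tuning decoding rule from the paper:
# at each step, the base model's logits are shifted by the offset between a
# small tuned expert and its untuned anti-expert. Model names and the greedy
# loop are illustrative only, not this repository's code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-13b-hf")        # large base model
expert = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf")  # small tuned expert
anti = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")         # small untuned anti-expert
tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

@torch.no_grad()
def proxy_tuned_generate(prompt: str, max_new_tokens: int = 64) -> str:
    ids = tok(prompt, return_tensors="pt").input_ids
    for _ in range(max_new_tokens):
        # Next-token logits from each model on the same prefix.
        base_logits = base(ids).logits[:, -1, :]
        expert_logits = expert(ids).logits[:, -1, :]
        anti_logits = anti(ids).logits[:, -1, :]
        # Proxy-tuning: shift the base logits by the expert/anti-expert offset.
        logits = base_logits + (expert_logits - anti_logits)
        next_id = logits.argmax(dim=-1, keepdim=True)  # greedy decoding for simplicity
        ids = torch.cat([ids, next_id], dim=-1)
        if next_id.item() == tok.eos_token_id:
            break
    return tok.decode(ids[0], skip_special_tokens=True)
```

Note that the three models must share a vocabulary for the logit arithmetic to be well defined, which is why all of them come from the same Llama-2 family in this sketch.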
## Evaluation
You can download a zip file of our evaluation data. Our evaluation setup is largely borrowed from Tülu 2 (codebase at https://github.com/allenai/open-instruct), with slight modifications. For examples of how the evaluation scripts are run, see `scripts/eval`.
## Citation
@misc{liu-etal-2024-tuning,
      title={Tuning Language Models by Proxy},
      author={Alisa Liu and Xiaochuang Han and Yizhong Wang and Yulia Tsvetkov and Yejin Choi and Noah A. Smith},
      year={2024},
      eprint={2401.08565},
      archivePrefix={arXiv},
      url={https://arxiv.org/abs/2401.08565}
}